Combinatorial optimization problems are difficult problems with a finite but typically enormous number of candidate solutions. The travelling salesman, bin-packing, and job-shop scheduling problems are among the best-known examples.
Researchers at the Amazon Quantum Solutions Lab, part of the AWS Intelligent and Advanced Computer Technologies Labs, have developed a new tool based on graph neural networks (GNNs) to solve combinatorial optimization problems. The method, created by Schuetz, Brubaker, and Katzgraber and published in Nature Machine Intelligence, could be applied to a wide range of real-world challenges.
Martin Schuetz, one of the researchers behind the study, told TechXplore: "Our work was very much influenced by customer needs. In our everyday work at the Amazon Quantum Solutions Lab, we interact with many customers across various verticals on their quest to become quantum-ready, or prepare for a future when this new technology will be commercially viable. The majority of customer use cases involve combinatorial optimization problems."
Combinatorial optimization problems can take many different shapes in the context of customer services. Portfolio optimization problems in finance and job-shop scheduling tasks in manufacturing are two well-known instances. Portfolio optimization is the task of selecting the best portfolio or asset distribution for a given situation from a set of available portfolios.
Job-shop scheduling problems, on the other hand, arise when a set of jobs or tasks must be completed with only a limited number of resources or tools available. In these situations, one may be asked to produce an optimal schedule that uses the available resources to complete all tasks in the shortest possible time.
Because quantum technology is still in its infancy, researchers have been working on optimization algorithms that do not rely entirely on quantum computers, at least until these computers become commercially viable. In their study, Schuetz and his colleagues presented a physics-inspired optimization strategy based on GNNs.
"Physics-inspired GNNs can be used today to approximately solve (large-scale) combinatorial optimization problems with quantum-native models," Brubaker stated.
The method developed by Schuetz and his colleagues begins by identifying the Hamiltonian (cost function) that encodes the specific optimization problem being addressed. The associated decision variables are then mapped to the nodes of a graph.
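As a concrete, hypothetical illustration of this encoding step, the cost function for MaxCut on a small graph can be written as a QUBO (quadratic unconstrained binary optimization) matrix with one binary variable per graph node. This is a minimal sketch in the spirit of the Ising/Hamiltonian encoding the authors describe, not their actual code:

```python
import numpy as np

# Encode MaxCut on a small graph as a QUBO matrix Q, so that the cost
# H(x) = x^T Q x is lowest for the best cut. Each edge (i, j) contributes
# -(x_i + x_j - 2*x_i*x_j), i.e. -1 when the edge is cut (x_i != x_j).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]  # 4-cycle with one chord
n = 4

Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1.0
    Q[j, j] -= 1.0
    Q[i, j] += 1.0
    Q[j, i] += 1.0

def cost(x):
    """Hamiltonian H(x) = x^T Q x; lower is better (more edges cut)."""
    return float(x @ Q @ x)

# Putting nodes {0, 2} on one side and {1, 3} on the other cuts 4 of
# the 5 edges, so the cost is -4.
x = np.array([1.0, 0.0, 1.0, 0.0])
```

Decision variables here correspond one-to-one with graph nodes, which is exactly what makes the problem amenable to a graph neural network in the next step.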
"Our core idea is to frame combinatorial optimization problems as unsupervised node classification tasks," Schuetz noted, "where the GNN learns colour (or spin, or variable) assignments for each node. To this end, the GNN is iteratively trained with a custom loss function that encodes the specific optimization problem of interest, in one-to-one correspondence with the original Hamiltonian, thus providing a principled loss function for the GNN."
Once the GNN was trained, the team projected the resulting soft node assignments to hard class assignments. This allowed them to approximately solve large-scale combinatorial optimization problems of interest.
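Assuming the problem has already been encoded as a symmetric QUBO matrix Q, the relax-train-project pipeline described above can be sketched in plain NumPy. In the paper a GNN produces the soft assignments; here, simple gradient descent on per-node logits stands in for it, so this only illustrates the relaxation and projection idea, not the authors' solver:

```python
import numpy as np

def solve_qubo(Q, steps=500, lr=0.1, seed=0):
    """Minimize the relaxed loss p^T Q p, then project to hard 0/1 values.

    A stand-in for the trained GNN: per-node logits are updated by
    gradient descent on the differentiable Hamiltonian loss, and the
    final soft assignments are thresholded at 0.5.
    """
    rng = np.random.default_rng(seed)
    logits = rng.normal(size=Q.shape[0])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-logits))      # soft assignments in (0, 1)
        grad = 2.0 * Q @ p                     # gradient of p^T Q p (Q symmetric)
        logits -= lr * grad * p * (1.0 - p)    # chain rule through the sigmoid
    p = 1.0 / (1.0 + np.exp(-logits))
    return (p > 0.5).astype(int)               # soft -> hard projection

# Toy example: MaxCut on a triangle, encoded so that lower cost = larger cut.
edges = [(0, 1), (1, 2), (0, 2)]
Q = np.zeros((3, 3))
for i, j in edges:
    Q[i, i] -= 1.0
    Q[j, j] -= 1.0
    Q[i, j] += 1.0
    Q[j, i] += 1.0

x = solve_qubo(Q)
cut = sum(int(x[i] != x[j]) for i, j in edges)  # a triangle's best cut is 2
```

Because the loss is continuous and differentiable in the soft assignments, any gradient-based learner can be plugged into the same loop; replacing the logits above with the output of a GNN recovers the structure of the authors' approach.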
The method presented by Schuetz and colleagues offers a number of advantages over other approaches to combinatorial optimization. Their technique is particularly scalable, which means it could be applied to huge problems involving hundreds of millions of nodes.
"Our GNN optimizer is based on a direct and universal mathematical relationship between prototype Ising spin Hamiltonians and the differentiable loss function with which we train the GNN," Brubaker stated. "By combining concepts from physics with modern machine learning techniques, we present a simple, general, and robust solver that does not rely on handcrafted loss functions."
Notably, the method developed by Schuetz and colleagues can approximately solve optimization problems without the training labels that supervised learning approaches require. And because it casts optimization problems as Ising Hamiltonians, the approach can operate natively on quantum hardware.
"We present a unified, interdisciplinary framework for optimization problems that combines ideas from physics with modern deep learning methods," Schuetz said. "We now have a tool at our disposal that is broadly applicable to canonical NP-hard problems, such as maximum cut, minimum vertex cover, maximum independent set, and Ising spin glasses."
In the future, the team's novel GNN-based technique could be used to solve a range of complicated real-world optimization challenges. Because it is inherently scalable, the Amazon Quantum Solutions Lab and AWS plan to offer it to their clients as a tool to help them transition to quantum technology. This method could allow customers to employ a quantum-native modelling framework to solve problems tied to specific use cases at both small and large scales.
"We will continue to investigate conceptual, theoretical, and more applied research problems in the future," Katzgraber said. "On the one hand, we have various ideas for improving and extending the capabilities of the proposed GNN optimizer; on the other, there are many practical use cases that this new tool can be applied to. Customer feedback will continue to guide and prioritise our research programme."