Combinatorial Optimization with LLMs/Transformers

I am curious whether a well-designed Transformer can solve something like the job-shop scheduling problem (JSSP) at a level comparable to genetic algorithms (GAs) and other heuristic approaches.

The logic I am coming from is that words are sequences, and a JSSP instance can be transformed into a sequence of tasks no matter what the precedence graph looks like. The final solution would then be a sequence of tasks, just as an LLM produces a sequence of words that make a story…
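To make the analogy concrete, here is a minimal sketch (my own illustrative assumptions, not from any paper) of how a tiny JSSP instance could be treated as a token sequence: each appearance of a job ID in the sequence means "run that job's next operation", so any permutation-with-repeats decodes into a valid schedule regardless of the precedence structure.

```python
# Each job is a list of (machine, duration) operations that must run in order.
jobs = [
    [(0, 3), (1, 2)],  # job 0: machine 0 for 3 units, then machine 1 for 2
    [(1, 4), (0, 1)],  # job 1: machine 1 for 4 units, then machine 0 for 1
]

def decode(job_sequence, jobs):
    """Decode a sequence of job IDs (each occurrence schedules that job's
    next unscheduled operation) into a schedule; return the makespan."""
    next_op = [0] * len(jobs)       # index of the next operation per job
    job_ready = [0] * len(jobs)     # time each job's previous op finishes
    machine_ready = {}              # time each machine becomes free
    for j in job_sequence:
        machine, duration = jobs[j][next_op[j]]
        start = max(job_ready[j], machine_ready.get(machine, 0))
        job_ready[j] = start + duration
        machine_ready[machine] = start + duration
        next_op[j] += 1
    return max(job_ready)

# The sequence below plays the role of the token stream an LLM would emit:
# interleaving the two jobs overlaps the machines and gives makespan 6,
# while the greedy order [0, 0, 1, 1] gives makespan 10.
print(decode([0, 1, 0, 1], jobs))  # → 6
```

The appeal of this encoding is that precedence constraints are enforced by the decoder, so a sequence model never has to emit an infeasible schedule, only a better or worse ordering.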

I did find some literature on this, but the problems are usually very small: a few dozen tasks with very simple, streamlined rules.


Yes, I’d be very interested in this as well 🙂


Is there now large-scale JSSP data available, e.g., millions of job-shop schedules?

I’m interested in LLM4CO too! Could you share the literature on the topic, please?

Me too. Here are some related papers I found recently. But I am doubtful about the promising performance claims, since LLMs are not that controllable:

  1. [2310.19046] Large Language Models as Evolutionary Optimizers
  2. (ICLR24-Google DeepMind) [2309.03409] Large Language Models as Optimizers
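For context on the second paper ([2309.03409], OPRO), the core loop is: the prompt carries a trajectory of past (solution, score) pairs, and the LLM is asked to propose a better solution. Here is a minimal sketch of that loop on a toy TSP; `mock_llm_propose` is a stand-in for the real LLM call (an actual run would serialize `history` into a text prompt), which is exactly where the controllability concern bites.

```python
import random

def tour_length(tour, dist):
    """Length of a closed tour under a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]]
               for i in range(len(tour)))

def mock_llm_propose(history, n):
    """Stand-in for the LLM proposer: perturb the best tour seen so far
    with a random 2-swap. A real OPRO loop would send the scored history
    as text and parse a new tour from the model's reply."""
    best = min(history, key=lambda pair: pair[1])[0]
    tour = list(best)
    i, j = random.sample(range(n), 2)
    tour[i], tour[j] = tour[j], tour[i]
    return tour

random.seed(0)
n = 6
# Toy instance: cities on a line, distance = coordinate difference.
dist = [[abs(i - j) for j in range(n)] for i in range(n)]
start = list(range(n))
random.shuffle(start)
history = [(start, tour_length(start, dist))]
for _ in range(50):
    cand = mock_llm_propose(history, n)
    history.append((cand, tour_length(cand, dist)))
best_tour, best_len = min(history, key=lambda pair: pair[1])
print(best_len)
```

The mutation operator here is trivial on purpose: the open question the papers raise is whether a real LLM proposer, conditioned on the scored history, does meaningfully better than such simple perturbations at scale.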

Check this continually updated list (GitHub - FeiLiu36/LLM4Opt: A Collection on Large Language Models for Optimization) on LLM4Opt, which includes combinatorial optimization and other related work.
Here is an ICML oral paper on LLM4CO (GitHub - FeiLiu36/EoH: Evolution of Heuristics).