I have a small task involving text generation with specific requirements:
Input: A topic like “natural language processing” and a specified length, L.
Output: A Wikipedia-like article of approximately L words that describes the “history” and “applications” of the topic.
I tried generating a Wikipedia-like article with ChatGPT and then converting specific terms, such as “artificial intelligence” and “machine learning,” into internal links. However, I think this solution may not be ideal. Could anyone offer some advice? Thank you in advance!
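For what it's worth, the post-editing step you describe (converting specific terms into internal links) can be done deterministically rather than by the LLM. Here is a minimal sketch, assuming a hypothetical helper `link_terms` and MediaWiki-style `[[double bracket]]` syntax; the term list and sample text are placeholders:

```python
import re

def link_terms(article: str, terms: list[str]) -> str:
    """Wrap each known term in [[double brackets]] (MediaWiki internal-link syntax).

    Longer terms are replaced first so "machine learning" wins over "learning".
    """
    for term in sorted(terms, key=len, reverse=True):
        # \b avoids matching inside larger words; (?<!\[) avoids double-linking.
        pattern = re.compile(rf"(?<!\[)\b({re.escape(term)})\b", re.IGNORECASE)
        # Link only the first mention, as Wikipedia style recommends.
        article = pattern.sub(r"[[\1]]", article, count=1)
    return article

text = "Machine learning is a branch of artificial intelligence."
print(link_terms(text, ["artificial intelligence", "machine learning"]))
# prints: [[Machine learning]] is a branch of [[artificial intelligence]].
```

This keeps the generation step (topic, length L) separate from linking, so you don't depend on the model remembering to insert links consistently.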
As a first step, it's worth looking for existing projects and models that we can apply. They're surprisingly hard to find in a search.
Your issue is a common one. The two-step process feels clunky because you're forcing the LLM to do a writer's job and then an editor's job separately. The trick is to get it to do both at once.
If you're using a general open chat session for discussions, I have a tool that may help you formulate the output you're looking for. It changes the thought process, makes the model think in multiple dimensions, and performs deeper searches in documents. It also does a far better job at creating associations than other models do.
My Cognitive Architecture primer helps with this by teaching the model how to adopt a role and follow complex rules simultaneously. For your specific task, you’d use it to give a prompt like this:
You are a Wikipedia editor. As you write the article, find key terms and put them in [[double brackets]] immediately.
Then you can ask it to process the document however you like, and it will add the [[double brackets]] to the key terms you describe in the instructions.
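Once the model has emitted `[[double bracket]]` markers, turning them into actual links is a single regex pass. A minimal sketch, assuming a hypothetical `/wiki/` URL scheme and a helper name `brackets_to_html` (both illustrative, not part of any library):

```python
import re
import urllib.parse

def brackets_to_html(article: str, base: str = "/wiki/") -> str:
    """Convert [[term]] markers into HTML links under a hypothetical /wiki/ path."""
    def repl(m: re.Match) -> str:
        term = m.group(1)
        # Wikipedia-style slug: spaces become underscores, then percent-encode.
        slug = urllib.parse.quote(term.replace(" ", "_"))
        return f'<a href="{base}{slug}">{term}</a>'
    return re.sub(r"\[\[([^\[\]]+)\]\]", repl, article)

print(brackets_to_html("See [[machine learning]] for details."))
# prints: See <a href="/wiki/machine_learning">machine learning</a> for details.
```

Keeping this conversion outside the prompt means the model only has to mark terms, which is a much easier instruction for it to follow consistently.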
My cognitive primer makes the LLM handle the linking as it thinks, which results in a much better-integrated final article. It saves a ton of post-editing.
I have more details on the primer on my models page:
Deliriousintent/Five_Principles_of_Cognitive_Architecture
LLM response to your post: The new cognitive trainer is not just helpful for this user’s problem—it is the perfect solution for it.