Highlights from Softwire’s Productionising GPT event

This month we held an event in Manchester exploring practical applications of generative AI.

The timing was significant: about one year earlier we’d held our last event exploring generative AI, and six weeks after that OpenAI released ChatGPT to the world. In 2022 only a handful of our audience had any experience with generative AI. This year, most of our audience said they use generative AI regularly.

So a lot has changed in a year, and how we use generative AI in production has changed too – it’s become more sophisticated, more refined, more than just chatbots.

The technology continues to develop, but there’s plenty of practical advice worth sharing as we reach the end of 2023, and that was the theme of this month’s event. In “Productionising GPTs”, Jon Artus explored a real-world application of large language models (LLMs) in financial services, explained some of the practical mechanics behind LLMs, and shared lessons on making the most of them.

In “LLM access patterns”, Ant Pengelly explored the innovations not only in LLMs themselves, but in how developers are accessing them. He used a practical demo to show how different access patterns can take LLMs in directions you might not expect.

Key takeaways from “Productionising GPTs”

  • First, “prompt engineering” – the art of asking good questions to your LLM – is important. Just like humans, LLMs need context to make decisions, and passing context via good prompts can make a measurable difference to the accuracy of your outputs.
  • If accuracy is really important to you, you can supplement prompts to your general LLM with specific factual data. If your LLM is classifying taxes, for example, your prompts can include curated advice from tax experts (see the sketch after this list). As a neat side-effect, including such facts in your prompts also reduces the risk of your LLM “hallucinating” facts instead.
  • Even with good prompt engineering, supplemental data and “fine-tuning”, these models aren’t going to be perfect. For use-cases where accuracy is essential, it’s still best to think of generative AI as a co-pilot, and to plan your UX accordingly.
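
To make the grounding idea concrete, here’s a minimal sketch in Python, assuming the OpenAI Python SDK (openai >= 1.0) and an API key in the environment. The tax “facts”, the model name and the classify_expense helper are illustrative placeholders, not anything shown in the talk.

```python
# A minimal sketch of grounding prompts in curated facts, assuming the
# OpenAI Python SDK (openai >= 1.0) and OPENAI_API_KEY in the environment.
# The facts, model name and helper are illustrative, not from the talk.
from openai import OpenAI

client = OpenAI()

# Hypothetical curated advice from tax experts, maintained alongside the code.
CURATED_FACTS = [
    "Standard-rated goods and services attract 20% VAT in the UK.",
    "Most food and children's clothing are zero-rated for VAT.",
]

def classify_expense(description: str) -> str:
    # Include the curated facts in the prompt so the model grounds its answer
    # in them, rather than inventing ("hallucinating") its own.
    facts = "\n".join(f"- {fact}" for fact in CURATED_FACTS)
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {
                "role": "system",
                "content": (
                    "You classify expenses for VAT purposes. "
                    "Base your answer only on these facts:\n" + facts
                ),
            },
            {"role": "user", "content": f"Classify this expense: {description}"},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(classify_expense("A box of children's raincoats"))
```
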
Jon Artus, Client Director at Softwire, presenting ‘Productionising GPT’

Key takeaways from “LLM access patterns”

  • Ant argued that since the release of ChatGPT last year, the biggest innovations in LLMs have been not in the LLMs themselves, but in how we access them.
  • Prompt engineering, for example, can improve the accuracy of LLMs through human interventions, but it can go further with the intervention of yet more generative AI. “Chaining” is when the output from one LLM is plugged into the prompt for another (see the sketch after this list). It can be used for simple processing (eg translation, removing sensitive content), but it can get much more sophisticated (eg LLMs managing other LLMs, if that’s something you’re comfortable with).
  • Finally, you can prototype this stuff very quickly. Ant’s practical demo, “EduTailor” – an app that generates personalised educational courses on any topic – looks like a proof-of-concept for a startup. He built it in just a couple of afternoons, and said one of the most time-consuming tasks was not the coding itself, but finding a large and free source of educational facts (credit: revisionworld.com).
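
As a rough illustration of chaining, here’s a minimal sketch in Python, again assuming the OpenAI Python SDK. The redaction step, model name and helper functions are hypothetical, not taken from Ant’s demo: the point is simply that the first call’s output feeds the second call’s prompt.

```python
# A minimal sketch of chaining two LLM calls, assuming the OpenAI Python SDK
# (openai >= 1.0): the output of a redaction call is plugged straight into
# the prompt of a second call. Model and helper names are illustrative.
from openai import OpenAI

client = OpenAI()

def ask(prompt: str) -> str:
    # One LLM call; both steps in the chain reuse this helper.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content

def answer_safely(user_text: str) -> str:
    # Step 1: one LLM rewrites the input with sensitive details removed.
    redacted = ask(
        "Rewrite the following text with any personal or sensitive details "
        f"removed, keeping the meaning intact:\n\n{user_text}"
    )
    # Step 2: the redacted output becomes the prompt for a second LLM.
    return ask(f"Answer the following question helpfully:\n\n{redacted}")

if __name__ == "__main__":
    print(answer_safely(
        "My name is Jane Doe, account 12345678. How do I reset my password?"
    ))
```
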
Anthony Pengelly, Technical Principal at Softwire, presenting ‘LLM access patterns’

Well done to both speakers for giving live demos (that worked) and for making some pretty esoteric concepts feel real and accessible. Special praise to Jon for maintaining humour and interest in a half-hour talk about tax and multi-dimensional vectors.

That’s the last of our Manchester events for 2023. We’re considering doing a deep dive into some of the concepts above – if we do, we’ll link those articles here. We’re also thinking about our events for 2024, so if there’s a topic you’d like us to consider, do drop a message to [email protected].

