Using AI as a “smart beginner” to quickly solve technical problems
Unless you’ve been living under a rock, you’ll have read an awful lot about Artificial Intelligence (AI) over the last few months: its capacity to change the world, and how it will soon become a huge part of our everyday lives. Alongside the positive changes, the potential negative consequences - everything from job losses and entrenched bias to (doomsday alert) mass extinction - seem faintly terrifying.
At Caution Your Blast Ltd (CYB) we share those concerns. We are engaged in an ongoing deep dive into AI tools and practices, looking at how we can incorporate the best of them into our work and deploy them in a progressive, ethical manner.
But on AI Appreciation Day, we want to highlight a positive use of AI, and show how AI is already helping CYB to achieve our stated aim of using digital as a force for good.
The project – building a new digital service for British nationals abroad
With our client the Foreign, Commonwealth and Development Office (FCDO), we recently found ourselves with an opportunity to use AI to help solve a technical problem quickly and unblock a team that was stuck.
We are in the middle of digitising a service that lets British nationals register the birth of their child abroad. Currently, customers have to print out and fill in a form and then post various pieces of evidence to the UK. Before downloading that form, they complete a separate short online form that asks a few questions and then lists the documents they need to post. We are combining both steps into one digital form that collects the required information about customers (saving them from filling out a paper form) and also tells them which documents they need to provide as evidence.
The issue – the team hitting a roadblock
In order to build our new digital service, we need to understand the rules and logic that determine which documents are required, depending on the country where the child was born. It turns out that this is quite complicated. There are hundreds of countries, and we didn’t know whether every country has the same list of required documents or whether the list differs for some of them. Manually checking the document list for every single country would take a long, long time. Naturally, we submitted a request to the client to retrieve the original logic behind the online form, but we were told it could take several weeks to process. In the meantime, our team would likely be unable to build out the logic required for our new process - a huge blow to a team that wants to progress quickly.
The solution - using CYB’s very own ChatGPT
This is where ChatGPT comes in, or rather, this is where our Slack integration with ChatGPT comes in. CYB, like many companies, is keen to understand how Large Language Models (LLMs) like ChatGPT could change our work - but we are also keen to ensure we don’t accidentally leak any proprietary information. So we have set up a bot in our Slack environment that lets our team use GPT-4 (the underlying model that ChatGPT uses) whilst retaining a central log of all interactions, which we can use to ensure compliance and to learn from each other’s experimentation.
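For the curious, here is a minimal sketch of how this kind of bot can be wired together, assuming the slack_bolt and openai Python packages. It is illustrative only - the event handling, model name and logging format below are our assumptions for this post, not a description of how JenPT is actually implemented.

```python
# A minimal sketch (not our production bot) of a Slack bot that forwards
# messages to GPT-4 and keeps a central log of every interaction.
# Assumes the slack_bolt and openai packages, plus SLACK_BOT_TOKEN,
# SLACK_APP_TOKEN and OPENAI_API_KEY environment variables.
import os
import json
import datetime

from slack_bolt import App
from slack_bolt.adapter.socket_mode import SocketModeHandler
from openai import OpenAI

app = App(token=os.environ["SLACK_BOT_TOKEN"])
client = OpenAI()  # reads OPENAI_API_KEY from the environment

LOG_FILE = "bot_interactions.jsonl"  # hypothetical central log


def log_interaction(user, prompt, reply):
    """Append every prompt/reply pair to a shared log for later review."""
    with open(LOG_FILE, "a") as f:
        f.write(json.dumps({
            "timestamp": datetime.datetime.utcnow().isoformat(),
            "user": user,
            "prompt": prompt,
            "reply": reply,
        }) + "\n")


@app.event("app_mention")
def handle_mention(event, say):
    # Forward the Slack message to GPT-4 and post the answer back.
    prompt = event["text"]
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[{"role": "user", "content": prompt}],
    )
    reply = response.choices[0].message.content
    log_interaction(event["user"], prompt, reply)
    say(reply)


if __name__ == "__main__":
    SocketModeHandler(app, os.environ["SLACK_APP_TOKEN"]).start()
```

Keeping the log as simple append-only JSON lines makes it straightforward to review later, both for compliance and for sharing useful prompts across the team.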
The great advantage of this is we are able to use JenPT (as our bot is named, after our amazing developer Jen!) as a “smart beginner”. What we mean by “smart beginner” is that we use ChatGPT like a human assistant, or like an intern - as a source for researching, developing and iterating ideas via a collaborative conversation, bit by bit.
And it has to be done bit by bit - as amazing as ChatGPT is, if you ask it to do a whole complicated task in one go, it’s not likely to succeed. LLMs like ChatGPT can’t really do planning - breaking a complicated task up into smaller sub-tasks and working through them in order. However, ChatGPT is pretty good at writing Python code, which is what was needed at this stage of the project. And we find that one really good way to work with ChatGPT is for a person to own the “structure and plan” of a task, but to ask ChatGPT to help with the individual pieces. If the answer given isn’t quite right, the human can “push” the AI a little bit to get the correct answer.
How JenPT helped – quickly iterating technical code
I asked JenPT to help me write a Python program that could interact with the online form and extract the full document list for each possible set of answers. JenPT gave us code that could run in Google Colab (a hosted notebook environment, accessible from our Google Workspace, that lets you run Python code without any installation or setup). At each step of the way we were able to ask JenPT for some code, look at the results, and use them to guide the next step of the process. This is an example of the “human working with smart beginner ChatGPT” process in practice.
For example, the first thing we did was figure out how to extract the list of countries inside the dropdown menu on the form’s first page, and then how to get from that list of countries to the final document list.
This is what the page ( https://www.gov.uk/register-a-birth/y ) looks like:
1. We asked JenPT to help us with this:
2. JenPT gave us code that we could copy and paste into Google Colab to run.
3. When we ran it, no countries were displayed. After exchanging a few messages about possible issues, JenPT was able to suss out why that might be and offer a suggestion on how to fix it.
4. In this instance, there was no ‘dropdown’ ID, so JenPT gave us code that just grabbed the first dropdown menu on the page (a rough sketch of that kind of code is shown after this list).
5. Running this code in Google Colab actually works! This is what it looks like when we run it.
You can see the country list being displayed at the bottom.
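We haven’t reproduced JenPT’s exact output here, but the general shape of that first snippet - fetch the page, grab the first dropdown and print its options - looks roughly like the sketch below. It assumes the requests and BeautifulSoup libraries, and the details of the page structure should be treated as assumptions rather than facts.

```python
# A rough sketch of the kind of code involved (not JenPT's exact output):
# fetch the form's first page and print the options from the first
# dropdown (<select>) element it contains.
import requests
from bs4 import BeautifulSoup

URL = "https://www.gov.uk/register-a-birth/y"

response = requests.get(URL)
response.raise_for_status()

soup = BeautifulSoup(response.text, "html.parser")

# The select element had no obvious id, so just grab the first one on the page.
dropdown = soup.find("select")
countries = [
    option.get_text(strip=True)
    for option in dropdown.find_all("option")
    if option.get_text(strip=True)  # skip any empty placeholder option
]

print(f"Found {len(countries)} countries")
for country in countries:
    print(country)
```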
Naturally this was not the end of the process for us, and it took roughly three hours of going back and forth with JenPT to retrieve the list of documents for each country and to extract the underlying rules that we needed to build our service. Even when we got errors, we were able to pass them into JenPT and “talk” with it about why the errors happened and how we might fix them.
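To give a flavour of where that back-and-forth ended up, here is a conceptual sketch of the final idea: visit the answer page for each country, collect the documents it lists, and group countries that share an identical list. The URL pattern, selectors and helper names below are our own illustrative assumptions, not the script JenPT ultimately produced - in practice it took several rounds of iteration to handle the real page structure.

```python
# Conceptual sketch only: the real script took several rounds of iteration
# with JenPT, and the URL pattern and selectors below are assumptions.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

BASE_URL = "https://www.gov.uk/register-a-birth/y"


def country_options():
    """Re-use the earlier idea: read the country names from the first
    dropdown on the form's first page."""
    soup = BeautifulSoup(requests.get(BASE_URL).text, "html.parser")
    return [
        option.get_text(strip=True)
        for option in soup.find("select").find_all("option")
        if option.get_text(strip=True)
    ]


def documents_for(country):
    """Fetch the (assumed) answer page for one country and pull out the
    bullet points, which we treat here as the required documents."""
    slug = country.lower().replace(" ", "-")  # rough guess at the URL slug
    page = requests.get(f"{BASE_URL}/{slug}")
    page.raise_for_status()
    soup = BeautifulSoup(page.text, "html.parser")
    return tuple(li.get_text(strip=True) for li in soup.select("main li"))


# Group countries that share exactly the same document list, so we can see
# how many follow a "standard" list and which ones are exceptions.
groups = defaultdict(list)
for country in country_options():
    try:
        groups[documents_for(country)].append(country)
    except requests.HTTPError:
        groups[("<no page found>",)].append(country)

for docs, names in sorted(groups.items(), key=lambda kv: -len(kv[1])):
    print(f"{len(names)} countries share a list of {len(docs)} documents")
```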
The result – we found the information we needed quickly
After just three hours of using JenPT, we knew that:
There are 226 countries in total
181 of them require the exact same set of “standard” documents
26 countries require a list of documents unique to that country
19 countries are unable to use the service (for various reasons)
This is exactly how we envisage using AI in our projects: using the technology to circumvent challenges and quickly - and accurately - solve technical problems. Compared with the blocker that was “wait several weeks for our client to find the underlying logic”, three hours spent working with ChatGPT was an excellent and productive use of our time.
Of course, once our client processed our request, we received the real logic definition from them and swapped it in for our temporary solution. ChatGPT is no replacement for a formal requirements agreement, but it helped us significantly in getting moving at the start of the project.
Thanks ChatGPT - or more accurately, JenPT!