3aIT Blog

It's now almost a couple of years since we wrote our first in-depth article about the rise of AI and the impact it may have on 3aIT and our clients. So, how are things looking as we get stuck into 2025? Are any of these new tools actually useful?

Even for people that don't pay much attention to this stuff, the constant drumbeat of "AI in everything" is impossible to miss now. If you have a phone that's new enough to still be receiving updates, you've no doubt been bombarded by notifications about new AI features. Most apps are introducing AI features. Various bits of new hardware are now being sold based on their AI smarts.

The trouble with all this is that it's not entirely clear what it all means for the average person. AI is basically being used as a synonym for "clever stuff", which, although it sells the idea that this new thing is better, doesn't really make it clear how.

AI with everything

The reality is that the way it's being used is often massively different in every case, and works to varying degrees in each one. The problem is that if you're presented with an example of AI being crowbarred in somewhere it isn't really useful, it's easy to come away with the impression that "AI is rubbish", and in turn to dismiss other applications of it that have been well implemented.

The other issue causing people to try these tools and bounce off them in some cases is a lack of expectation management. As AI has been talked about in terms of completely replacing swathes of people, the implication is that you can just set it off doing whatever that person did, and it will come back with a perfect version of whatever that is. However, in our experience, it gets about 90% there (or is right about 90% of the time, to put it another way).

Our own experience

To take an example from our own experience, we have been trialling using AI to assist us with coding on the development side of the business. There is an initial learning curve with these tools as you work out if and how they can make things easier or quicker for you. So what does "90% there" mean in this case? Well, it can mean a couple of things. Sometimes it will do stuff that just doesn't work (as in you run it and get an error). That's annoying, but at least it's obvious.

The other bit of it is more subtle. In many ways, these text-based generative tools are a lot like getting infinite wishes from a genie. They will give you whatever you ask for, even if the better thing to do would be to say "There are 20 ways to do that. Which one would you prefer?" or "Are you sure that's what you want?". This can lead to situations where the thing it's done works fine on the face of it, but it's been done in a way that is storing up problems for the future.

Does that mean it's useless to us? Absolutely not. However, it does mean that it still very much needs people with the experience to know what to do when those "10% things" happen. This is especially true when it's done something that works, but not in a good way. There are also situations where you still pretty much have to do the whole thing yourself, albeit with some helpful suggestions on what you might type next adding minor efficiencies.

This does pose an interesting dilemma for new developers coming to these tools with no experience. If you don't know how to build whatever you're building yourself, how are you going to know if it's doing the right thing the wrong way? How are you going to catch the 10% if you don't know the 100%? It works, so you move on to the next thing. It will be months down the line, when the client asks for a change, that you realise you can't make it without rebuilding whole chunks of the code, or that you've ended up with a combination of solutions that massively slows down a system which could have been built far more efficiently if you'd understood what was happening.
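To make the "right thing the wrong way" point concrete, here's a toy sketch (our own illustration in Python, not output from any particular tool): two functions that return the same answer, where the first is the sort of thing an assistant might happily produce. It passes every quick test, but hides a problem that only shows up at scale.

```python
def find_duplicates_slow(items):
    # Works, and looks fine on a small test, but compares every item
    # against the rest of the list: roughly O(n^2) as the data grows.
    dupes = []
    for i, item in enumerate(items):
        if item in items[i + 1:] and item not in dupes:
            dupes.append(item)
    return dupes


def find_duplicates_fast(items):
    # Same answer in a single pass using sets: O(n).
    seen, dupes = set(), set()
    for item in items:
        if item in seen:
            dupes.add(item)
        seen.add(item)
    return sorted(dupes)


print(find_duplicates_slow([1, 2, 2, 3, 3]))  # [2, 3]
print(find_duplicates_fast([1, 2, 2, 3, 3]))  # [2, 3]
```

On a five-item list the two look identical. It's only later, when that list is fifty thousand customer records, that the difference bites - which is exactly the kind of "10%" an experienced developer catches in review.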

Proceed with caution

It should be said that coding is something these AI tools are particularly suited to, so our experience of them will definitely be at the more positive end of the scale of helpfulness. Even so, the other issue with the "10% problem" is when these tools are left to their own devices. This leads to companies having to publicly withdraw them, as Apple recently found with its tool that created news summaries which misinterpreted some high-profile stories, getting the facts wrong. Naturally, journalists pounced on this - it proves they can never be replaced, right?

This probably gets to the heart of the conflicting approaches here, and as is almost always the case, the likely best path lies somewhere between the extremes that get the publicity. On one side, you have businesses trying to completely replace people with these tools, and ending up frustrated or embarrassed. On the other, you have people refusing to engage with any of this, insisting what they do can never be reproduced by AI, and hoping that if they make enough noise, maybe it will all go away.

If you're a plumber, there probably aren't many AI tools that can help you with your job (at least for now). However, for anyone whose job involves using a computer most of the time, it's likely that at least some of what you do can be aided by these tools - and "aided" is the important word there. It may well not get 90% there in your case. For example, maybe the best application is something like tidying up the formatting of a document you've written. For now, at least, our opinion is that it is very much a matter of working with these tools rather than hoping they can do your or someone else's job in its entirety. Do not expect perfect results. Expect something that you'll need to finesse, and that sometimes you'll need to step in and do yourself. That doesn't work so well in the marketing spiel though, does it?