AI Wiping Out Software Engineers?

Since January 2023, the world seems to have changed dramatically, yet our daily lives seem to have changed very little. Disruptive technologies hit the senses so hard that people tend to overestimate their short-term effects and underestimate their long-term impact. Either way, we can foresee that this AI breakthrough will bring enormous changes to human life, and practitioners in almost every industry are trying to embrace it. The apparent "omnipotence" of ChatGPT has led many people to ask whether we even need software anymore. As a practitioner in the software industry, I was anxious at first, but looking at it calmly, LLMs (large language models) and software are two different species; one does not replace the other.

Will AI replace software?

So how should programmers think about this new thing called AI (the LLM)? Although AI is built from code, I do not categorize it as software. Software is code with program logic, and its defining characteristic is that it is deterministic. The code of an AI model, by contrast, carries no procedural logic of its own; it is a black box of parameters, built by training rather than by writing a program, and its defining characteristic is that it is probabilistic. That is a fundamental difference from software. Before the advent of AI, there were three kinds of actors in the world: humans, software, and the physical world (which contains all animals).
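
A toy contrast may help (my own illustration, not taken from any real system): the software function below always maps the same input to the same output, while the "model" merely samples from a probability distribution, so the same prompt can produce different outputs.

```python
import random

def add(a: int, b: int) -> int:
    """Software: explicit program logic, deterministic by construction."""
    return a + b

# A toy stand-in for a trained model: no program logic, just learned probabilities.
NEXT_WORD_DISTRIBUTION = {"router": 0.6, "modem": 0.3, "cable": 0.1}

def toy_model(prompt: str) -> str:
    """'AI': samples from a probability distribution, so outputs vary run to run."""
    words = list(NEXT_WORD_DISTRIBUTION)
    weights = list(NEXT_WORD_DISTRIBUTION.values())
    return prompt + " " + random.choices(words, weights=weights)[0]

print(add(2, 3))                        # always 5
print(toy_model("Please install the"))  # may differ on every call
```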

The first interactions to emerge were between humans and AI: products like ChatGPT, where the two sides influence each other through language. With just this one layer of interaction, AI can already influence software and the physical world indirectly. For example, when you ask ChatGPT how to install a router, it tells you to do steps 1, 2, 3, 4, and you then act as the AI's hands and feet to change the physical world. Likewise, when you ask ChatGPT how to change the resolution on a Mac, it gives you the specific steps, and you click the mouse on the AI's behalf to complete the operation.

This seems a bit backwards: we want the AI to do the work, not to work for the AI. So a lot of people started thinking about how to "enable AI to take actions", which led to products like ChatGPT Plugins, and to Microsoft building Copilot into Windows 11 at the system level. You can simply tell the AI that you need the resolution adjusted, instead of following the AI's instructions to adjust it yourself (see the sketch below). This ability lets AI interact with existing software, which in turn affects humans and the physical world. As for whether AI can directly manipulate the physical world, we have not seen a finished product yet, but many robotics companies are working on it.
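
To make "enable AI to take actions" concrete, here is a minimal sketch of a tool-calling loop. Everything in it is illustrative: `call_llm` is a faked stand-in for a real model, and `set_resolution` is a hypothetical piece of ordinary software the model is allowed to invoke; no specific vendor API is implied.

```python
import json

def set_resolution(width: int, height: int) -> str:
    """Traditional, deterministic software the AI is allowed to invoke."""
    return f"Resolution set to {width}x{height}"   # real code would call the OS settings API

TOOLS = {"set_resolution": set_resolution}

def call_llm(messages):
    """Stand-in for a real model: here it pretends the model chose to call a tool."""
    if messages[-1]["role"] == "user":
        return {"tool_call": {"name": "set_resolution",
                              "arguments": json.dumps({"width": 1920, "height": 1080})}}
    return {"content": "Done - your display is now 1920x1080."}

def run(user_request: str) -> str:
    messages = [{"role": "user", "content": user_request}]
    reply = call_llm(messages)
    while "tool_call" in reply:                    # the model decided to act instead of just talk
        call = reply["tool_call"]
        result = TOOLS[call["name"]](**json.loads(call["arguments"]))
        messages.append({"role": "tool", "content": result})
        reply = call_llm(messages)
    return reply["content"]

print(run("Please switch my display to 1920x1080."))
```

The point of the loop is the division of labor: the model decides what to do, while a deterministic function actually does it.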

In terms of interaction, ChatGPT brought us a brand-new form of interface: ChatUI. For a while the industry was extremely hot on ChatUI, even obsessed with it, believing that chat was going to rule the world; that belief was also the source of the question "will AI replace software?". ChatUI fits human intuition very well in many scenarios, but it cannot solve every problem. The future will surely be a coexistence of many kinds of UI.

Words are only one of the means by which humans convey information, and there are many scenarios that words cannot describe but a gesture or a glance can handle. Call it the color-picker problem: with your finger you can pick the color you want in three seconds, but you cannot describe in words that particular violet with a hint of blue.

In fact, there are many LLM applications in production systems where chat is not the primary interface, GitHub Copilot being one of them, so let us look at ChatUI as a new thing in a calm and objective way.

The ideas of "big models eat everything" and "the end of programming" essentially claim that big models can do everything, so we no longer need to write programs, only to train models. As far as the principles and current practice of LLMs are concerned, it is not possible for big models to replace traditional programs.

You can compare the big model to the human brain and the traditional program to a calculator. The human brain can also do addition, subtraction, multiplication, and division, but can it replace the calculator? We all know that neural networks were designed to simulate the human brain. We cannot yet say precisely how faithful that simulation is, but the starting point, at least, was to imitate the brain, so it stands to reason that the shortcomings of the human brain will show up in big models as well. Practice so far bears this out: big models are not good at calculation, cannot retrieve information with precision, and behave randomly. These are exactly the weaknesses of the human brain, and exactly the strengths of traditional programs.
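
One practical consequence is the pattern of letting the model reason in language while handing exact computation to ordinary code. The "calculator tool" below is a minimal sketch of the deterministic side of that pairing (my own illustration, using Python's ast module to evaluate arithmetic safely); a real system would have the LLM choose which expression to send to it.

```python
import ast
import operator

# The deterministic "calculator": exact arithmetic, no randomness, no approximation.
OPS = {ast.Add: operator.add, ast.Sub: operator.sub,
       ast.Mult: operator.mul, ast.Div: operator.truediv}

def calculate(expression: str):
    """Safely evaluate a plain arithmetic expression such as '1234 * 5678 + 9'."""
    def walk(node):
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        raise ValueError("only basic arithmetic is allowed")
    return walk(ast.parse(expression, mode="eval").body)

# In a real system, the LLM would extract this expression from the user's question
# and call the tool, rather than guessing at the digits itself.
print(calculate("123456789 * 987654321"))   # exact every time
```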

Will AI replace programmer jobs?

To answer this question, we have to be clear about what AI brings: AI is an intellectual revolution, a replacement for intelligence. The Industrial Revolution reduced the agricultural share of the UK workforce from 60% to 10%, and the Information Revolution reduced the industrial share of the US workforce from 40% to 8%. Following this line of reasoning, if AI is an intellectual revolution, the share of white-collar workers in the job market could fall from over 60% to single digits. From this perspective, AI will indeed replace programmer jobs in the long run.

If AI can replace people, that means it replaces a factor of production. The impact on productivity would be enormous: more human creativity would be unleashed, old jobs would be eliminated and new ones created, and everyone's life would be affected.

The intelligence of GPT-4 is already quite high, and GPT-5 may surpass 80% of humans. In this context, the question becomes how to let AI genuinely take over a certain type of work. At present, however, AI is still better suited to being a helper than a driver. Products built entirely around AI doing the work are still at the toy stage, while assistive AI products are more mature, GitHub Copilot being a good example. A tool like Copilot cannot replace the programmer; it is a gain on top of existing production tools, not a replacement for the productivity itself.

DSL-ization of work content

When we talk about LLMs replacing people in a given industry, they usually have to interact not only with people but also with industry-specific knowledge, data, and systems. There are two ways to instill industry knowledge into an LLM: fine-tuning and prompt engineering. As far as actual industry practice goes, fine-tuning has not yet reached a consensus and is very costly, so the large majority of current applications are based on prompt engineering. Notably, the world's most widely used model, GPT-4, does not offer fine-tuning as an option.
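
As a minimal sketch of what prompt engineering looks like in code (the scenario and the `DOMAIN_NOTES` content are invented for illustration), the industry knowledge is injected into the prompt at request time rather than baked into the model's weights:

```python
# Prompt engineering: industry knowledge goes into the prompt, not into the weights.
DOMAIN_NOTES = """\
- Claims above $10,000 require a second reviewer.
- Policy codes starting with 'H' are home insurance; 'A' is auto.
"""  # in a real system, these notes might be retrieved from a knowledge base

def build_prompt(question: str) -> list:
    """Assemble the messages sent to the model for one request."""
    return [
        {"role": "system",
         "content": "You are an insurance claims assistant. Follow these rules:\n"
                    + DOMAIN_NOTES},
        {"role": "user", "content": question},
    ]

messages = build_prompt("Can I approve a $12,000 claim on policy H-2291 by myself?")
# These messages would then be sent to whichever model you use.
print(messages[0]["content"])
```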

Many issues that were once seen as critical, such as the segregation of duties in software development, multi-language programming, complex frameworks, and human-computer interaction, may no longer matter as much. Meanwhile, some previously overlooked capabilities, such as open APIs, are gaining importance.
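
For example, a capability only becomes usable by an AI once it is exposed as an open, machine-readable interface. The snippet below sketches that idea with a JSON-Schema-style tool description; the exact shape is an assumption for illustration, not any particular vendor's specification.

```python
import json

# A machine-readable description of one capability, so an AI can discover and call it.
SET_RESOLUTION_TOOL = {
    "name": "set_resolution",
    "description": "Change the display resolution of the current machine.",
    "parameters": {
        "type": "object",
        "properties": {
            "width":  {"type": "integer", "description": "Horizontal pixels"},
            "height": {"type": "integer", "description": "Vertical pixels"},
        },
        "required": ["width", "height"],
    },
}

# This description is what gets handed to the model alongside the user's request.
print(json.dumps(SET_RESOLUTION_TOOL, indent=2))
```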

As a result, we need to revisit our tools and methods. Tools that seem excellent and important to humans may not be appropriate for AI. For AI to produce and consume more effectively, we need to rebuild tools for AI, rather than simply handing human tools over to it.

This means that every industry needs to start thinking about how to build tools that are better suited to AI. Only then will AI be able to produce and consume more conveniently and take over human work more effectively. This is not only a technical challenge, but also a shift in mindset.

