Friday, December 30, 2022

Who does ChatGPT actually disrupt?

 


The most exciting new tool this year is ChatGPT, and I've been playing with it quite a bit. While I've yet to see it actually used in the corporate world, I can imagine how powerful it could be if it gets incorporated into a search engine.

The most potent use of ChatGPT is in software engineering. I asked the engine to write a RESTful API in Python to perform some operations, and it generated the code and even walked me through how it works. I can imagine a programmer using ChatGPT to build subroutines and increase productivity by 200%-300%. For finance, it can explain basic concepts but cannot make actual stock recommendations based on the current market situation. ChatGPT is not yet useful for lawyering. The best I can manage is getting it to explain complex terms like renvoi, but that's not enough to get any real work done.

In about 3-5 years, specialised GPT AIs covering domains like finance and law will become a reality, and professionals in all fields will experience a massive boost in productivity. I can now imagine describing a factual situation to a search engine and having the system point me to the possible cases or statutes on which I can base my research. The system cannot replace the human touch we bring to advising clients, so I expect to earn more revenue in the future as smaller firms scale faster.

But one of my observations is that technologists don't panic about ChatGPT. It could be because technologists eat disruption for breakfast. (Technologists also cut and paste from Stack Overflow.)

The negativity comes from academics who work in the humanities and are worried about cheating in essays. Maybe there's some fear that the work of a GPT AI is starting to earn a B- grade, which can be better than the crap the non-honours humanities graduates produce. The answer to ChatGPT is simple: run the essay question through ChatGPT and ensure that all students see the output before starting their homework. To pass, they need to add a human touch and some novelty to what the AI produces.

In fact, the ability to critique a ChatGPT essay and provide alternatives to it should be a skill future graduates are expected to have.

Finally, if you've noted what our leadership has been talking about, there is some effort underway to improve the rewards of blue-collar or hands-on work. The question is whether new AI technologies will widen or bridge the income gap between white and blue-collar workers.

I believe the opposite of bridging will occur.

I think the gap between professionals and blue-collar workers will widen. With better productivity, the gap in salary between professional and general degrees will widen even further. This is something our government should take note of. If a cluster of liberally educated local graduates cannot find well-paying jobs, they will form an alternative voice and have every incentive to topple everything we've built. Just wait for Epigram to hold a book fair and measure the size of the crowds browsing those books to get a feel for how mutinous the young people have become.

MOE is wise to shut down the Yale-NUS Liberal Arts program, but they should take painful steps to limit seats in the Humanities programs to a small elite intake that will get assured jobs in the civil or uniformed services.

Those who don't make the grade are better off picking up a plumbing certificate and emigrating to Australia.

[The art accompanying this article was generated by DALL-E using the prompt "Paladin in Hell". It took seconds, and I no longer have to deal with talented but unreliable artistic types. If I ever publish role-playing rules again, I no longer have to worry about splitting my revenue with a low-conscientiousness flake.]






 

2 comments:

  1. I need to play with ChatGPT.

    I think it can do the work of a hundred dumb people. But not the work of one smart person where exactness, inquisitiveness, or creativity are required. As an analogy, it won't help a programmer reverse engineer or debug a complex application, or write requirements for a specialised system. But it could write a boilerplate CRUD application (maybe even some of the UI and requirements for it).
