Update: don't read this, read much more informed people than me.

This ChatGPT thing is pretty crazy, hey. I've got to say that as a software engineer with a healthy dose of imposter syndrome it makes me a little nervous. Are they going to take our jobs?

Throughout history, people have spruiked a future where everything is automated and we have more leisure time. Of course, we've all been around the block enough times to know that it never quite works out like that. If businesses can get machines to do your job, they'll just take the profits and run... but I digress.

To me it seems like ChatGPT will be integrated into our IDEs pretty soon (if it isn't already). That's not necessarily a bad thing; we should all be trying to write less code. I can imagine a little prompt saying "would you like me to optimise this code for you?", or highlighting a chunk of code and telling ChatGPT to "allow this function to handle multiple timezones".

My worry is how authoritatively it presents itself: it's very sure of itself, but it's not necessarily right. I also suspect that ChatGPT would struggle with the human and social aspects of writing code, which are so critical.

It's also often wrong, as Stack Overflow has pointed out. And it's necessarily historical: it can only learn from what has already happened. How quickly would it catch up with a newly discovered security issue, or would it just keep importing log4j? Would it favour OOP over functional programming?

My other worry concerns teaching new and junior engineers how to write code. I can imagine a world where all day-to-day code is written by ChatGPT, but I can't imagine it taking in a large amount of business logic and structuring an application that accurately encapsulates a domain. So how does one bridge that gap? How do you become a Software Architect (or reach another senior role) if you've never had to write much code?

Interesting times.