
Rose-tinted predictions for artificial intelligence’s grand achievements will be swept aside by underwhelming performance and dangerous results.

The year 2023 saw widespread hype around generative AI, and expectations of AI-powered breakthroughs became commonplace. In 2024, those expectations will need to be readjusted as evidence of generative AI's limitations accumulates. These models are prone to hallucination, confidently producing false information, which makes it difficult to anchor their output to known truths. Predictions of exponential productivity gains and imminent artificial general intelligence will prove unfounded, shifting the focus toward identifying which human tasks these models can genuinely augment.

Meanwhile, the adoption of generative AI will displace jobs without delivering significant productivity gains. ChatGPT and similar models will come to dominate social media and online search, amplifying manipulation, misinformation, and mental health harms. The industry will settle into a duopoly, with Google and Microsoft/OpenAI as the dominant players, prompting calls for antitrust action and regulation. Meaningful regulation, however, is unlikely to arrive until the US government catches up with the technology in subsequent years.



