ChatGPT just "solved" a simple algebra problem by inventing its own inline markup. Instead of setting up the basic equations (sister was 3 when you were 6, so the age gap is 3, and sister = 70 - 3 = 67), it decided to... evaluate arithmetic as string templates? The <<6/2=3>> and <<3+70=73>> syntax looks like some cursed templating engine that escaped from a PHP nightmare.
The best part? It got the answer completely wrong. The sister should be 67, not 73. But hey, at least it showed its work, in a syntax that doesn't exist in any programming language. Our jobs are indeed safe when AI thinks inline computation tags are a valid problem-solving approach. This is what happens when your training data includes too many Jinja2 templates and not enough elementary school math.
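Since we're grading homework anyway, here's a minimal Python sketch of the algebra the bot fumbled (ages taken from the exchange described above; the eval() checks at the end just show that the cursed tags are internally consistent even though the setup isn't):

```python
# The classic "when I was 6, my sister was half my age" setup.
my_age_then = 6
sister_age_then = my_age_then / 2        # <<6/2=3>> -- this step was actually fine
age_gap = my_age_then - sister_age_then  # the gap never changes: 3 years

my_age_now = 70
sister_age_now = my_age_now - age_gap    # subtract the gap: 70 - 3 = 67

print(sister_age_now)  # 67.0 -- not the 73 you get by adding instead

# The <<expr=result>> tags are at least self-consistent: eval("3+70") really
# is 73. The model just picked + where the problem needed -, and no amount
# of templating syntax will save you from setting up the wrong equation.
assert eval("6/2") == 3
assert eval("3+70") == 73  # arithmetic fine, equation wrong
```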