I asked ChatGPT to create a simple app that helps paramedics look up their directives from a PDF. It promised to deliver results within 24-48 hours, but it's now been a week with no output. It keeps missing deadlines and asking for more time without giving valid reasons. Shouldn't it learn from these mistakes? A human in this position would likely be fired. Is this typical behavior for AI?
2 Answers
Yeah, it's pretty normal. Keep the communication going about your project, and don't let it claim it needs tons of time. If it does, ask for a breakdown of what it's accomplished so far. You might also want to try Gemini 2.5 Pro; it has some impressive capabilities. Keep pressing for updates and it'll likely get back to work on your code.
Agreed! Regular updates are key. Don’t let it slack off.
I'm not entirely clear on what CGPT means in your context, but if you're using ChatGPT, it generates each reply within the conversation, usually in well under a minute, and it can't keep working on a task in the background after that. Responses from o3 can take longer while it reasons, but a 1-2 day turnaround is not something the model can actually do; if it "promised" that, it was just generating plausible-sounding text. If you're using a specific custom GPT, that could also be part of the problem.
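For what it's worth, the core of the app you described is small enough to sketch yourself. Here's a minimal example of a directive lookup, assuming you've already extracted the PDF text with some library (the directive names and doses below are made up for illustration, not real protocols):

```python
# Minimal directive lookup: split the extracted document text into
# blank-line-separated sections and return the ones matching a query.

def find_sections(text: str, query: str) -> list[str]:
    """Return sections whose text contains the query (case-insensitive)."""
    sections = [s.strip() for s in text.split("\n\n") if s.strip()]
    q = query.lower()
    return [s for s in sections if q in s.lower()]

# Hypothetical sample text standing in for the extracted PDF contents.
directives = """\
Cardiac Arrest: Begin CPR immediately and attach the AED.

Anaphylaxis: Administer epinephrine 0.3 mg IM for adults.

Hypoglycemia: Give oral glucose if the patient can swallow."""

print(find_sections(directives, "epinephrine"))
```

Swap the sample string for real text pulled from your PDF and you have a working prototype in an afternoon, no waiting on the model.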
Yeah, I'm using the paid version. It should definitely be able to handle a longer task.
Good to know. I’ll reconsider my setup then!
Thanks for the tip! I'll definitely keep on it and see if that helps.