I've been working on my own project and I'd rather have my own AI assistant instead of relying on ChatGPT. I'm considering building a chatbot and meeting note-taker using AWS services like Lex, Lambda, S3, and CloudFront to host my own UI. Do you think this approach would be practical, and can I end up saving money compared to using ChatGPT?
5 Answers
The cool thing about AWS is it allows you to experiment easily. Spin up your services, test them out, and if it doesn't work out, you can tear them down. I've seen some great workshops on building bots with Lex and Bedrock—you might find those helpful for getting started!
Check out the Amplify AI kit! It has pre-built AI conversation components, so you can get a fully functional chatbot up and running in no time by following their documentation.
If you factor in your time and effort, it might actually be cheaper to stick with an existing service like ChatGPT. Per-token API pricing can look expensive, but those services are optimized to deliver value at scale. Building on AWS will require more up-front development effort and may not yield a cost benefit for a personal project.
Lex is primarily for parsing natural language into intents and slots that trigger API calls, so you'd need other AWS services, such as Bedrock for response generation or Transcribe for meeting audio, to build a complete solution. Keep in mind that ChatGPT benefits from economies of scale a small self-hosted setup can't match, which likely makes it more cost-effective for rapid deployment.
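To make the Lex-plus-Bedrock wiring concrete, here's a minimal sketch of a Lex V2 fulfillment Lambda that forwards the user's utterance to a Bedrock model via the Converse API. The model ID is an assumption (use whichever model is enabled in your account), and the injectable `bedrock` parameter is just there so the handler can be exercised without live AWS credentials:

```python
def close_response(intent_name, text):
    """Build a Lex V2 'Close' response that returns `text` to the user."""
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent_name, "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": text}],
    }

def lambda_handler(event, context, bedrock=None):
    """Lex V2 fulfillment hook: pass the raw utterance to a Bedrock model.

    `bedrock` is injectable for testing; inside Lambda it defaults to the
    real client. The model ID below is an assumption, not a recommendation.
    """
    if bedrock is None:
        import boto3
        bedrock = boto3.client("bedrock-runtime")

    utterance = event.get("inputTranscript", "")
    intent_name = event["sessionState"]["intent"]["name"]

    reply = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",
        messages=[{"role": "user", "content": [{"text": utterance}]}],
    )
    text = reply["output"]["message"]["content"][0]["text"]
    return close_response(intent_name, text)
```

You'd attach this Lambda as the fulfillment code hook on your Lex intent; Lex handles the front-of-house conversation flow and the Lambda delegates the actual generation to Bedrock.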
When talking about cost, it's crucial to define 'cheaper'. With generative AI, something has to generate the responses, whether that's an API like Amazon Bedrock or your own model running on a GPU-equipped instance, and the latter can get pricey. For example, I have a g4dn.2xlarge that costs me about $18 a day! Hosting your front end on AWS and connecting it to a powerful home PC could save some money, but then you have to factor in uptime and electricity costs.
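Some back-of-the-envelope math on that trade-off. The hourly rate below is an assumption (roughly the g4dn.2xlarge on-demand price in us-east-1 at the time of writing); check current AWS pricing before relying on it:

```python
# Rough cost comparison: self-hosted GPU instance run 24/7 vs. part time.
# GPU_HOURLY_USD is an assumed on-demand rate; verify against AWS pricing.
GPU_HOURLY_USD = 0.752
HOURS_PER_DAY = 24
DAYS_PER_MONTH = 30

daily = GPU_HOURLY_USD * HOURS_PER_DAY        # ~$18/day, matching the figure above
monthly = daily * DAYS_PER_MONTH              # ~$541/month if left running 24/7

# Stopping the instance outside working hours changes the picture a lot:
part_time_monthly = GPU_HOURLY_USD * 8 * 22   # ~$132/month for 8h x 22 weekdays

print(f"24/7:       ${monthly:,.0f}/month")
print(f"8h/weekday: ${part_time_monthly:,.0f}/month")
```

The point being: a GPU instance left running around the clock dwarfs a flat-rate subscription, so if you go this route, automate stopping the instance when you're not using it.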
You might also want to look at running your app on ECS with Fargate and scaling the service down to zero tasks when idle; that way you're not paying for constant uptime.