Suggestions for Long-Term Email Processing Automation

Asked By TechieTurtle123

I'm looking for advice on a long-term email processing automation project at my job (it takes up about 20% of my time). The goal is a tool that pulls emails from a mail server, scans their content, headers, and attachments for a specific 10-digit project identifier, and checks whether the senders belong to certain domains. If a relevant identifier is found, the email and its attachments are uploaded to a cloud storage system, making them accessible to other team members.

Currently, I'm considering using a staging folder for downloading emails and a separate temporary folder for processed emails before they get uploaded. I'd prefer to keep these processes asynchronous, but I'm uncertain about the best way to achieve that. I have some experience with subprocesses, and I've also been exploring asyncio.

I'm planning to schedule the email download via cron jobs, followed by another service that processes, uploads, cleans up files, and updates the email status on the server. I'd greatly appreciate any insights or suggestions, especially from those who have experience with similar projects. Thanks for your help!
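For the scanning step described above, a minimal sketch of the identifier-and-domain check might look like this. The pattern, the domain list, and all the function names are assumptions for illustration, not anything from this thread:

```python
import re

# Hypothetical values -- adjust to your real identifier format and domains.
PROJECT_ID_RE = re.compile(r"\b\d{10}\b")  # a bare 10-digit identifier
ALLOWED_DOMAINS = {"example.com", "partner.example.org"}

def sender_is_allowed(sender: str) -> bool:
    """Check whether the sender's address belongs to an allowed domain."""
    domain = sender.rsplit("@", 1)[-1].lower()
    return domain in ALLOWED_DOMAINS

def find_project_ids(*texts: str) -> set:
    """Collect every 10-digit identifier found in body, headers, or attachment text."""
    ids = set()
    for text in texts:
        ids.update(PROJECT_ID_RE.findall(text))
    return ids

def is_relevant(sender: str, *texts: str) -> bool:
    """A message is relevant if the sender is allowed and at least one ID was found."""
    return sender_is_allowed(sender) and bool(find_project_ids(*texts))
```

Passing body, headers, and extracted attachment text as separate arguments keeps the matching logic independent of how you parse the message itself.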

5 Answers

Answered By CuriousCompliance

In Europe, you need to be cautious about how you handle emails because of strict data protection laws (the GDPR in particular). Make sure you're compliant if you're scanning emails for specific keywords or personal data.

BobTheBuilder -

That’s a good point! I’m focusing on project-related identifiers, but I’d like to know more about what's permitted for IT teams in terms of email processing.

Answered By PowerAutoGenius

Have you considered using Power Automate for the workflow? It's a straightforward option for handling tasks like downloading and processing emails, and it might save you time as it integrates nicely with the Graph API, which can be beneficial in a corporate environment.

Answered By ItConsultantMike

Before you dive too deep, make sure you have authorization and that your mail server is compatible with the tools you're planning to use. If you're not sure about either, consult your IT department first to avoid complications.

BobTheBuilder -

I'm all set with app-level permissions for Microsoft Graph API, and the performance has been solid. I can process emails quickly.
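With app-level permissions, polling a mailbox via Graph is typically a GET against `/users/{id}/messages`. A sketch of building such a request URL (the mailbox address, the unread-only filter, and the selected fields are illustrative choices, not anything from this thread):

```python
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.microsoft.com/v1.0"

def messages_url(mailbox: str, top: int = 50) -> str:
    """Build a Graph query for unread messages in a given mailbox.

    The sender-domain check is simpler to do client-side after fetching
    the 'from' field, since OData string filters on nested address
    properties are limited in Graph v1.0.
    """
    params = {
        "$filter": "isRead eq false",
        "$select": "id,subject,from,hasAttachments",
        "$top": str(top),
    }
    return f"{GRAPH_BASE}/users/{mailbox}/messages?{urlencode(params)}"
```

You would send this URL with a bearer token obtained through the client-credentials flow; pagination then follows the `@odata.nextLink` values in each response.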

Answered By DevGuru42

You might want to look into using an API to process emails directly without downloading them first. If you're using a service like Google, their API lets you fetch emails directly, which could save you time and avoid redundant storage. But I get it: if your company's email isn't hosted with Google, you'll need the equivalent API for your platform.

BobTheBuilder -

Yeah, I should've clarified earlier - I'm using Microsoft Graph API for this, so I've got that piece sorted. I’m really just trying to nail down the orchestration of the scripts.
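For the orchestration question, one common pattern is an asyncio pipeline: each stage is a coroutine connected to the next by a queue, so download, processing, and upload overlap without subprocesses. A sketch with stubbed stages (the message IDs and stage bodies are placeholders for the real Graph/upload calls):

```python
import asyncio

async def downloader(out_q: asyncio.Queue) -> None:
    """Stage 1: fetch message IDs (stubbed) and hand them downstream."""
    for msg_id in ("msg-1", "msg-2", "msg-3"):  # stand-in for an API call
        await out_q.put(msg_id)
    await out_q.put(None)  # sentinel: no more work

async def processor(in_q: asyncio.Queue, out_q: asyncio.Queue) -> None:
    """Stage 2: scan/transform each message, then pass it on for upload."""
    while (msg_id := await in_q.get()) is not None:
        await out_q.put(f"{msg_id}:processed")
    await out_q.put(None)  # propagate the sentinel

async def uploader(in_q: asyncio.Queue, results: list) -> None:
    """Stage 3: upload (stubbed) and record what was handled."""
    while (item := await in_q.get()) is not None:
        results.append(item)

async def run_pipeline() -> list:
    q1: asyncio.Queue = asyncio.Queue()
    q2: asyncio.Queue = asyncio.Queue()
    results: list = []
    await asyncio.gather(downloader(q1), processor(q1, q2), uploader(q2, results))
    return results
```

A cron job can still kick off `asyncio.run(run_pipeline())` on a schedule; the queues just let the stages overlap within each run.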

Answered By FileWiseAndy

It's definitely a good idea to avoid keeping all files in one directory. I once worked with a system that did that, and we ended up hitting file handle limits. Instead, try implementing a structured folder hierarchy, perhaps by year and project number. This way, you'll prevent those indexing issues and have a backup on-premises, in case cloud access is ever disrupted.
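A year-and-project hierarchy like the one suggested is a one-liner with pathlib. A sketch (the root directory name and function are hypothetical):

```python
from datetime import date
from pathlib import Path

def archive_path(root: Path, project_id: str, received: date) -> Path:
    """Place each email under <root>/<year>/<project-id>/ so no single
    directory grows unbounded and per-project lookups stay cheap."""
    return root / str(received.year) / project_id

# e.g. archive_path(Path("archive"), "1234567890", date(2024, 5, 17))
# yields archive/2024/1234567890
```

Call `mkdir(parents=True, exist_ok=True)` on the returned path before writing into it.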

BobTheBuilder -

Currently, I'm moving emails from a 'downloaded' folder to a 'processed' folder after handling them, which seems to avoid duplication issues. But I like the idea of using a date hierarchy. I’m trying to keep all files temporary for security reasons, so I won’t have them around for long.
