
I built my 1st AI Automation

I built my first automation while following along with this tutorial.

Here is what the automation looks like:

The reason there are technically 2 automations is that the first one simply sources content and filters it for relevance against set criteria, while the second uses that filtered content to generate new content.

But why can’t both just be combined into one automation flow?

I’m honestly not sure. When I asked GPT, it said it comes down to modularity, control, and performance… Anyways…

Basically, it starts by scraping content from Reddit subs via Apify. Then I filter those posts’ content for relevance using ChatGPT. Once the content passes the relevance filter, the automation searches a Google Sheet to make sure the post doesn’t already exist in the database (which, in this case, is just that Sheet). If the content is relevant and isn’t already there, it gets added to the database.
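In plain code, the first flow boils down to "keep it if it's relevant and we haven't seen it." Here's a rough Python sketch of that logic — `is_relevant` is a stand-in for the ChatGPT relevance call, the keyword set is an assumed criterion, and `sheet_rows` stands in for the Google Sheet, so none of these names come from Make or Apify:

```python
# Hypothetical stand-in for the GPT relevance check and the Sheet lookup.
KEYWORDS = {"automation", "make", "apify"}  # assumed relevance criteria

def is_relevant(post: dict) -> bool:
    """Placeholder for the ChatGPT module: naive keyword match on the title."""
    title = post["title"].lower()
    return any(word in title for word in KEYWORDS)

def add_new_posts(scraped_posts: list[dict], sheet_rows: list[dict]) -> list[dict]:
    """Append posts that are relevant and not already in the sheet."""
    existing_urls = {row["url"] for row in sheet_rows}
    for post in scraped_posts:
        if is_relevant(post) and post["url"] not in existing_urls:
            sheet_rows.append(post)
            existing_urls.add(post["url"])
    return sheet_rows
```

Deduplicating on the post URL (rather than the title) is one way to avoid re-adding the same Reddit post when the scraper runs again.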

The next flow in this automation grabs the content from the Sheet in order to generate a new headline and body text for our newsletter. Then the content’s status is updated and the pieces are aggregated via Make’s text aggregator module (i.e., defining the formatting for the content that was just generated). Next, another GPT module generates the title and intro for this newsletter edition based on the generated content. Then Make’s markdown tool formats the conclusion text. Finally, the content is updated in the database and a draft of the newsletter is created in Google Docs.
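The aggregation step is really just string assembly: take each generated headline/body pair, stack them in order, and wrap them with the title, intro, and conclusion. A minimal sketch of that, assuming hypothetical field names (`headline`, `body`) rather than Make’s actual module output:

```python
# Hypothetical sketch of the text-aggregator step: join generated
# sections into one markdown draft for the Google Doc.
def build_newsletter(title: str, intro: str, sections: list[dict], conclusion: str) -> str:
    """Combine the generated pieces into a single markdown document."""
    body = "\n\n".join(f"## {s['headline']}\n\n{s['body']}" for s in sections)
    return f"# {title}\n\n{intro}\n\n{body}\n\n{conclusion}"
```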

The tutorial actually showed how to send it in Mailchimp, but if I were doing this for real I wouldn’t mind just copying the text from the Google Doc so that I could read it over and iterate before sending.

I had really wanted to source the content from an RSS feed collection, but there were some issues… I upgraded to Feeder’s paid RSS plan, where I could create one RSS feed that aggregates various other feeds, with the aim of having AI filter and summarize content that way. However, many media outlets block automation programs from scraping their content. Additionally, each RSS feed gave me only the headline and a one-sentence summary; the actual content lived behind a link to the media publication, and despite watching countless hours of tutorials, I still can’t figure out how to get an automation to open a link and grab the content. Those two issues led me to just follow the tutorial on YouTube.
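To show what made the feeds so thin: a typical RSS item carries only a title, a short description, and a link out to the full article — the body text itself simply isn’t in the feed. A small standard-library sketch with a made-up sample feed (the outlet and URL are invented for illustration):

```python
import xml.etree.ElementTree as ET

# Invented example feed, shaped like RSS 2.0: note the item has only
# a headline, a one-sentence description, and a link to the article.
SAMPLE_FEED = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Outlet</title>
  <item>
    <title>Big headline</title>
    <description>One-sentence summary.</description>
    <link>https://example.com/full-article</link>
  </item>
</channel></rss>"""

def feed_items(xml_text: str) -> list[dict]:
    """Pull out the title, summary, and link of each <item> in the feed."""
    root = ET.fromstring(xml_text)
    return [
        {
            "title": item.findtext("title"),
            "summary": item.findtext("description"),
            "link": item.findtext("link"),
        }
        for item in root.iter("item")
    ]
```

Getting the real article text would mean following each `link` and fetching the page, which is exactly the step many outlets block for automated clients.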