I’m a huge fan of automation when the scenario allows for it. Maybe you need to keep track of guest information when they RSVP to your event, or maybe you need to monitor and react to feeds of data. These are two of many possible scenarios where you probably wouldn’t want to do things manually.
There are quite a few tools designed to automate your life. Some of the popular tools include IFTTT, Zapier, and Automate. The idea behind these services is that, given a trigger, you can run a series of actions.
In this tutorial, we’re going to see how to collect Twitter data with Zapier, store it in MongoDB using a Realm webhook function, and then run aggregations on it using the MongoDB query language (MQL).
There are a few requirements that must be met prior to starting this tutorial:
There is a Zapier free tier, but because we plan to use webhooks, which are premium in Zapier, a paid account is necessary. To consume data from Twitter in Zapier, a Twitter account is necessary, even if we plan to consume data that isn’t related to our account. This data will be stored in MongoDB, so a cluster with properly configured IP access and user permissions is required.
You can get started with MongoDB Atlas by launching a free M0 cluster, no credit card required.
While it isn't necessary to create the database and collection prior to use, we'll be using a zapier database and a tweets collection throughout the scope of this tutorial.
Since the plan is to store tweets from Twitter within MongoDB and then create queries to make sense of it, we should probably get an understanding of the data prior to trying to work with it.
We’ll be using the “Search Mention” functionality within Zapier for Twitter. Essentially, it allows us to provide a Twitter query and trigger an automation when the data is found. More on that soon.
As a result, we’ll end up with the following raw data:
{
  "created_at": "Tue Feb 02 20:31:58 +0000 2021",
  "id": "1356701917603238000",
  "id_str": "1356701917603237888",
  "full_text": "In case anyone is interested in learning about how to work with streaming data using Node.js, I wrote a tutorial about it on the @MongoDB Developer Hub. https://t.co/Dxt80lD8xj #javascript",
  "truncated": false,
  "display_text_range": [0, 188],
  "metadata": {
    "iso_language_code": "en",
    "result_type": "recent"
  },
  "source": "<a href='https://about.twitter.com/products/tweetdeck' rel='nofollow'>TweetDeck</a>",
  "in_reply_to_status_id": null,
  "in_reply_to_status_id_str": null,
  "in_reply_to_user_id": null,
  "in_reply_to_user_id_str": null,
  "in_reply_to_screen_name": null,
  "user": {
    "id": "227546834",
    "id_str": "227546834",
    "name": "Nic Raboy",
    "screen_name": "nraboy",
    "location": "Tracy, CA",
    "description": "Advocate of modern web and mobile development technologies. I write tutorials and speak at events to make app development easier to understand. I work @MongoDB.",
    "url": "https://t.co/mRqzaKrmvm",
    "entities": {
      "url": {
        "urls": [
          {
            "url": "https://t.co/mRqzaKrmvm",
            "expanded_url": "https://www.thepolyglotdeveloper.com",
            "display_url": "thepolyglotdeveloper.com",
            "indices": [0, 23]
          }
        ]
      },
      "description": {
        "urls": ""
      }
    },
    "protected": false,
    "followers_count": 4599,
    "friends_count": 551,
    "listed_count": 265,
    "created_at": "Fri Dec 17 03:33:03 +0000 2010",
    "favourites_count": 4550,
    "verified": false
  },
  "lang": "en",
  "url": "https://twitter.com/227546834/status/1356701917603237888",
  "text": "In case anyone is interested in learning about how to work with streaming data using Node.js, I wrote a tutorial about it on the @MongoDB Developer Hub. https://t.co/Dxt80lD8xj #javascript"
}
The data we have access to is probably more than we need. However, it really depends on what you’re interested in. For this example, we’ll be storing the following within MongoDB:
{
  "created_at": "Tue Feb 02 20:31:58 +0000 2021",
  "user": {
    "screen_name": "nraboy",
    "location": "Tracy, CA",
    "followers_count": 4599,
    "friends_count": 551
  },
  "text": "In case anyone is interested in learning about how to work with streaming data using Node.js, I wrote a tutorial about it on the @MongoDB Developer Hub. https://t.co/Dxt80lD8xj #javascript"
}
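If you'd rather do this trimming in code than in Zapier's field mapper, the reduction can be sketched as a small JavaScript helper. The function name (mapTweet) is ours for illustration and not part of any Twitter or Zapier API:

```javascript
// A minimal sketch of the reduction above: keep only the fields
// we plan to store in MongoDB and drop everything else.
function mapTweet(raw) {
  return {
    created_at: raw.created_at,
    user: {
      screen_name: raw.user.screen_name,
      location: raw.user.location,
      followers_count: raw.user.followers_count,
      friends_count: raw.user.friends_count
    },
    text: raw.text
  };
}
```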
Without getting too far ahead of ourselves, our analysis will be based on the followers_count and the location of the user. We want to be able to make sense of where our users are and give priority to users that meet a certain followers threshold.
Before we start connecting Zapier and MongoDB, we need to develop the middleware that will be responsible for receiving tweet data from Zapier.
Remember, you’ll need to have a properly configured MongoDB Atlas cluster.
We need to create a Realm application. Within the MongoDB Atlas dashboard, click the Realm tab.
For simplicity, we’re going to want to create a new application. Click the Create a New App button and proceed to fill in the information about your application.
From the Realm Dashboard, click the 3rd Party Services tab.
We’re going to want to create an HTTP service. The name doesn’t matter, but it might make sense to name it Twitter based on what we’re planning to do.
Because we plan to work with tweet data, it makes sense to call our webhook function tweet, but the name doesn’t truly matter.
With the exception of the HTTP Method, the defaults are fine for this webhook. We want the method to be POST because we plan to create data with this particular webhook function. Make note of the Webhook URL because it will be used when we connect Zapier.
The next step is to open the Function Editor so we can add some logic behind this function. Add the following JavaScript code:
exports = function (payload, response) {
  const tweet = EJSON.parse(payload.body.text());
  const collection = context.services.get("mongodb-atlas").db("zapier").collection("tweets");
  return collection.insertOne(tweet);
};
In the above code, we are taking the request payload, getting a handle to the tweets collection within the zapier database, and then doing an insert operation to store the data in the payload.
There are a few things to note in the above code:

- When we call our function, a new document should be created within MongoDB.
- By default, the function will not deploy when saving. After saving, make sure to review and deploy the changes through the notification at the top of the browser window.
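Once deployed, you can exercise the webhook from any HTTP client before wiring up Zapier. Here's a hedged Node.js sketch (Node 18+ ships a global fetch, so no packages are needed); the URL is a placeholder you'd replace with the Webhook URL noted earlier:

```javascript
// WEBHOOK_URL is a placeholder; substitute the Webhook URL from the
// Realm HTTP service settings page.
const WEBHOOK_URL = "https://example.com/your-realm-webhook-url";

// A sample document shaped like the data we plan to store.
const tweet = {
  created_at: "Tue Feb 02 20:31:58 +0000 2021",
  user: { screen_name: "nraboy", location: "Tracy, CA" },
  text: "Hello from Zapier!"
};

// POST the JSON document to the webhook, mirroring what Zapier will do.
async function sendTweet(doc) {
  const response = await fetch(WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(doc)
  });
  return response.json();
}
```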
So, we know the data we’ll be working with and we have a MongoDB Realm webhook function that is ready for receiving data. Now, we need to bring everything together with Zapier.
For clarity, new Twitter matches will be our trigger in Zapier, and the webhook function will be our event.
Within Zapier, choose to create a new “Zap,” which is an automation. The trigger needs to be a Search Mention in Twitter, which means that when a new Tweet is detected using a search query, our events happen.
For this example, we’re going to use the following Twitter search query:
url:developer.mongodb.com -filter:retweets filter:safe lang:en -from:mongodb -from:realm
The above query says that we are looking for tweets that include a URL to developer.mongodb.com. The URL doesn’t need to match exactly as long as the domain matches. The query also says that we aren’t interested in retweets. We only want original tweets, they have to be in English, and they have to be detected as safe for work.
In addition to the mentioned search criteria, we are also excluding tweets that originate from one of the MongoDB accounts.
In theory, the above search query could be used to see what people are saying about the MongoDB Developer Hub.
With the trigger in place, we need to identify the next stage of the automation pipeline. The next stage is taking the data from the trigger and sending it to our Realm webhook function.
As the event, make sure to choose Webhooks by Zapier and specify a POST request. From here, you’ll be prompted to enter your Realm webhook URL and the method, which should be POST. Realm is expecting the payload to be JSON, so it is important to select JSON within Zapier.
We have the option to choose which data from the previous automation stage to pass to our webhook. Select the fields you’re interested in and save your automation.
The data I chose to send looks like this:
{
  "created_at": "Tue Feb 02 20:31:58 +0000 2021",
  "username": "nraboy",
  "location": "Tracy, CA",
  "follower_count": "4599",
  "following_count": "551",
  "message": "In case anyone is interested in learning about how to work with streaming data using Node.js, I wrote a tutorial about it on the @MongoDB Developer Hub. https://t.co/Dxt80lD8xj #javascript"
}
The fields do not match the original field names from Twitter because I chose to map them to names that made sense to me.
Once the Zap is deployed, any tweet that matches our query will be saved into our MongoDB cluster.
With tweet data populating in MongoDB, it’s time to start querying it to make sense of it. In this fictional example, we want to know what people are saying about our Developer Hub and how popular these individuals are.
To do this, we’re going to want to make use of an aggregation pipeline within MongoDB.
Take the following, for example:
[
  {
    "$addFields": {
      "follower_count": {
        "$toInt": "$follower_count"
      },
      "following_count": {
        "$toInt": "$following_count"
      }
    }
  },
  {
    "$match": {
      "follower_count": {
        "$gt": 1000
      }
    }
  },
  {
    "$group": {
      "_id": {
        "location": "$location"
      },
      "location": {
        "$sum": 1
      }
    }
  }
]
There are three stages in the above aggregation pipeline.
We want to understand the follower data for the individual who made the tweet, but that data comes into MongoDB as a string rather than an integer. The first stage of the pipeline takes the follower_count and following_count fields and converts them from string to integer. In reality, we are using $addFields to create new fields, but because they have the same names as existing fields, the existing fields are replaced.
The next stage is where we want to identify people with more than 1,000 followers as a person of interest. While people with fewer followers might be saying great things, in this example, we don’t care.
After we’ve filtered out people by their follower count, we do a group based on their location. It might be valuable for us to know where in the world people are talking about MongoDB. We might want to know where our target audience exists.
The aggregation pipeline we chose to use can be executed with any of the MongoDB drivers, through the MongoDB Atlas dashboard, or through the CLI.
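As a sketch of the driver route, here is how the pipeline above might be run with the official Node.js driver (npm install mongodb). The function name and the connection string you pass in are placeholders of ours, not anything prescribed by the driver:

```javascript
// The same three-stage pipeline from above, expressed in JavaScript.
const pipeline = [
  {
    "$addFields": {
      "follower_count": { "$toInt": "$follower_count" },
      "following_count": { "$toInt": "$following_count" }
    }
  },
  { "$match": { "follower_count": { "$gt": 1000 } } },
  {
    "$group": {
      "_id": { "location": "$location" },
      "location": { "$sum": 1 }
    }
  }
];

// A hedged sketch: connect to the cluster, run the aggregation on the
// zapier.tweets collection, and return the grouped results. Pass in
// your Atlas connection string as uri.
async function popularLocations(uri) {
  const { MongoClient } = require("mongodb"); // npm install mongodb
  const client = new MongoClient(uri);
  try {
    await client.connect();
    const tweets = client.db("zapier").collection("tweets");
    return await tweets.aggregate(pipeline).toArray();
  } finally {
    await client.close();
  }
}
```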
You just saw how to use Zapier with MongoDB to automate certain tasks and store the results as documents within the NoSQL database. In this example, we chose to store Twitter data that matched certain criteria, later to be analyzed with an aggregation pipeline. The automations and analysis options that you can do are quite limitless.
If you enjoyed this tutorial and want to get engaged with more content and like-minded developers, check out the MongoDB Community.
This content first appeared on MongoDB.
Original article source at: https://www.thepolyglotdeveloper.com/
Thinking about creating a job board for your niche? Or have you identified a field you think could use some help with increased employment? A survey by Allegis reported that the internet is where people most commonly look for jobs, so if you’re reading this, you’re already off on the right foot.
Now, figuring out how to build your own job board can be overwhelming when giants like Indeed, Glassdoor, and Seek exist, but let’s start small. You can build your own with Pory.io using our job template made with Airtable.
This article will teach you
When you’re done, you should have a little something that looks like this demo:
Okay, if I’ve still got you here, I think that means you’re ready to begin creating your own job board with us so let’s get started.
Creating your site in 6 steps
5. Insert the base ID of the selected base, “Pory Job Template”
I know this looks a little intimidating if you don’t understand most of it but the ID of the base highlighted is what we are after. Don’t sweat the rest!
6. Your Pory.io home page should look a little something like this. Voila! You’ve created your site.
Now let’s open up the site we just created by clicking on the edit icon and selecting View data to see how the site connects to Airtable.
You should see 3 tables inside the template:
With this in mind, we can begin creating the following features for your board! 🔥
Some bonus bits to add:
Easy peasy, let me show you.
Head over to our **Posted Jobs** table and select the **Submit a job** form view over on the left. This is the form we’ll use to allow users to submit new job listings, so go ahead and edit the field names to fit your platform needs.
Next, you want to embed the form you’ve created into your site. Click the **Share form** button on the top left-hand corner and copy the Embed Code in the new window that pops up:
Return to your site editor and **Edit** the top section your form is in. Paste the embedded form code you’ve copied into the right-hand panel, under the “Airtable” tab.
Filters for the job board can be configured in the “Filters” table. Below, we’ve used popular ones, “Location” and “Position” in the columns Filter 1 and Filter 2 respectively.
Here’s a link to more filter help from our team.
#airtable #nocode #nocode-tutorials #no-code-platform #pory #zapier #indie-hackers #bootstrapping
Learn how to use Zapier to automate every day tasks in your life. We’ll walk through setting up an automation for sending you a text if it’s going to rain, setting a print job to print a test page once a week, and sending you an email notification every time there’s a new job via RSS feed.
Every day, we perform hundreds of small tasks. On their own, they don’t take much time. But they can add up, especially if you consider that time for a whole year.
But we’re technologists and it’s 2020. How can we use tools like Zapier to make robots do these things for us?
Zapier is an automation tool that connects all of the apps you love and builds powerful, fully automated workflows. Whether it’s automating an email or making sure that new blog post gets a tweet, we can remove the manual steps of mundane tasks to focus on other things that are important.
Each time you create a new workflow, you’re creating a “Zap”. It’s essentially Zapier’s way to give a name to the workflow you create.
The brilliant part about Zapier is each app integration makes its API available via Zapier to other app integrations giving you a ton of options to connect and build powerful workflows.
Particularly, we’re going to learn how to do a few things:
While each of these tasks is small, together they end up saving you a lot of time. And if you’re creative, you can build upon these workflows to customize a whole lot more.
Before we get into setting up workflows, you’ll need an account.
Signing up for Zapier is free and you get 5 free Zaps to start, so we don’t have to worry about cost here.
Now let’s get into the Zaps.
To get an idea of how this works, we’ll start with something simple. We’re going to set up a Zap that will send us a text message if the weather predicts rain.
To get started, click the big Make a Zap button on the top left of the page when you’re logged into your account.
Making a new Zap
Here, Zapier wants to know the first app we want to connect. Since we’re going to base our Zap on the weather, search for “weather” and select Weather by Zapier.
Selecting the Weather by Zapier integration
It will then ask you to choose a Trigger Event, where you’ll select “Will It Rain Today?”, then you can click the Continue button.
Choosing the Will it Rain Today? event
When choosing Weather as an event, it requires a little bit of information to give us a personalized prediction. Particularly, it requires your Latitude and Longitude, which we can look up using latlong.net.
Finding latitude and longitude with latlong.net
You can then enter your Latitude and Longitude into the Customize Forecast screen of Zapier, select your Units which defaults to Fahrenheit, and then click the big Continue button.
Configuring forecast with latitude and longitude
At this point, you can click Test Trigger, which simply makes sure it’s working, and click Continue again.
Now we’re going to tell Zapier what to do with the information once it knows it’s going to rain.
In the “Do this…” panel, search for “sms” and select SMS by Zapier.
Selecting SMS by Zapier
We’re going to leave the App and Event as the defaults, so the next screen you can just click Continue.
SMS App and Event
Now, for Zapier to send you a text, it needs to verify that your phone number belongs to you or that the phone number is knowingly signing up for these texts. To do that, it sends you a one-time PIN that you’ll enter.
So click Sign in to SMS by Zapier, which will open a popup window.
Here, enter your phone number, and choose SMS or Call as the verification method, at which point it will contact you with a PIN.
Sign in to SMS by Zapier and send a PIN
With that PIN, enter it into the field and click Continue.
Entering the SMS verification PIN
At this point the window will close and move you back to the original flow. Here, click Continue again.
Now we get to customize the text that we receive.
In the From Number field, Zapier gives a bunch of phone numbers you can use. You can either select one number to always send from, which you can set up as a contact so you know it’s Zapier, or you can select Random, which will use a random number every time.
Then, click inside of Message, and it will bring up some options. I want to know everything possible if it’s going to rain, including the probability, max temperature, and summary, so we can select all or as much as we want and again click Continue.
Configuring Weather message
Finally we get to test if our Zap worked. At this point everything should be configured, so click the Test & Review button and you should receive a sample text!
Note: If you choose a single From Number, you might be limited in how frequently you can receive texts, so if you don’t get a text right away, that might be why. Setting Random helps prevent that issue, but then the number isn’t consistent.
And once you’re happy with the configuration, you can click Turn On Zap.
Turning on the SMS Zap
You’ll now get texts in the morning if the weather is predicting rain!
This one doesn’t sound exciting, but have you ever gone through a long period of time where you didn’t print something, only to end up with dried out printer heads or worse yet, a now unsalvageable printer?
We can avoid this by simply running a weekly print job that keeps our printer ink nice and fresh.
For this, we’ll use Google Cloud Print. To make this work, you’ll need to already have this configured with your Google account.
Let’s create a new Zap and this time for our “When this happens…” search for and select Schedule by Zapier.
Selecting Schedule by Zapier
We can then select a Trigger Event of Every Week and click Continue.
Setting Trigger Event as Every Week
Next, you can choose the Day Of The Week and Time Of Day you’d like to print. I personally run this job weekly at 8pm on Sundays right before the start of a new week. Once configured, click Continue.
Configuring Zapier schedule
At this point, we can click Test trigger, which just like before makes sure it’s working properly, and then we can click Continue.
Now, for our “Do this…” we want to print, so search for and select Google Cloud Print.
Selecting Google Cloud Print
And for the action, select Submit Print Job.
Setting Submit Print Job as Action Event
At this point, you’ll need to sign into Google Cloud Print. This will open a window and have you log in through Google so that Zapier can interface with the service.
Once connected, click Continue.
Now we can configure our print job. Here we’ll want to define what we print.
Configuring Print Job in Zapier
In the above, we’re configuring:
The rest of the fields are optional, feel free to customize to your liking.
With our configuration set, click Continue, and similar to before, we can click Test to see our print job in action and if we’re happy, we can click Turn on Zap!
Test print from Zapier print job
If you want to use the same document, you can find it here: https://fay.io/printer-test.pdf
If we’re looking for a job, it can be a pain to have to visit every job board every day (or every hour, am I right?). But we can automate this process when the job board supports it.
Luckily, job boards like Smashing Magazine and a whole lot of others provide RSS feeds which we can hook right into Zapier to automate getting an email whenever a new job is posted.
To get started, let’s create a new Zap, and this time, search for RSS and select RSS by Zapier.
Selecting RSS by Zapier
For our Trigger Event, select New Item in Feed, then click Continue.
Setting New Item in Feed as Trigger Event
At this point, we want to enter a feed URL. This will be the URL to the XML RSS feed that websites make available. For Smashing Magazine, you can find it here:
https://www.smashingmagazine.com/jobs/feed
So enter that URL above into Feed URL (you can leave Username and Password blank), and keep Different GUID selected for What Triggers A New Feed Item. Then click Continue.
Setting RSS Feed URL
Same as usual, now you can test the trigger to make sure it works. If the RSS feed is valid, this will be smooth, otherwise you might see an error. The above URL should be valid!
Testing the RSS feed
Next, we need to choose what we want to do with the new item. Since we want it emailed, we can choose our email service, which in my case is Gmail.
Selecting Gmail in Zapier
For our action, we want to Send Email.
Setting Action Event as Send Email
Next, you want to sign in to your account, similar to what we did with Google Cloud Print. This should be your Google account that you use Gmail with.
Sign in to Gmail
Now when we customize our email, we want to include the following:
Configuring job notification email
Once you’re done configuring, you can hit continue. Then, similar to before, click Test & Review, and you should receive your test email.
Test job alert email
Finally if you’re happy with the configuration, turn on the Zap, and enjoy your job search!
Here’s more ideas to get you moving in the right direction:
The only thing that it’s missing for me right now is Google Assistant, otherwise I would have included some Zap ideas for that. IFTTT supports Google Assistant for simpler flows, but Zapier can get more powerful.
#zapier #developer
Zapier is a great product for connecting different services via their APIs. It makes it simple to connect different services to automate your workflow. You’re probably a developer, so you’ll really appreciate how simple Zapier makes things!
For example, with Raygun Webhooks, you might listen for them and:

- Post error notifications to an RSS feed
- Send a message to your chat service of choice
- Send a notification to a service like PagerDuty

or trigger literally thousands of other things. It makes connecting services together a breeze.
#raygun #test #zapier #integration #testing
Once you arrive at the form that Zapier prompts you with for your PostgreSQL information, you should be able to enter all the server information and user credentials, and the connection should work right away! No cryptic error messages.
#development #zapier #postgres #google-sheets