Save yourself time when coming back to previous work by using the git commands diff and status, plus a bit of bash.
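The idea can be sketched in a few lines of Python: wrap `git status --short` and `git diff --stat` to see where you left off, and filter the status output for modified files. This is a minimal sketch, not the article's script; the function names are my own.

```python
import subprocess

def git_summary():
    """Summarize where you left off. Assumes the current
    directory is inside a git repository."""
    status = subprocess.run(
        ["git", "status", "--short"], capture_output=True, text=True
    ).stdout
    diff = subprocess.run(
        ["git", "diff", "--stat"], capture_output=True, text=True
    ).stdout
    return status, diff

def modified_files(status_output):
    """Extract paths of changed tracked files from `git status --short` output."""
    return [
        line[3:] for line in status_output.splitlines()
        if line[:2].strip() in {"M", "MM", "AM"}
    ]
```

Running `git_summary()` before closing a session (or from a shell alias) gives a quick picture of uncommitted work the next time you sit down.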
On Wednesday, March 11, 2020, I conducted the webinar titled “Monitoring & Orchestrating Your Microservices Landscape using Workflow Automation”. Not only was I overwhelmed by the number of attendees, but we also got a huge list of interesting questions before and especially during the webinar. Some of them were answered, but a lot of them were not. I want to answer all open questions in this series of seven blog posts. Today I am posting the final two in the series.
The new profile page showcases information about you: your work, your skills, and your blog posts.
Data scientists do lots of exploration and experimentation. Jupyter Notebook (Notebook from here onwards) is a great tool for exploring and experimenting. However, things can get cluttered and messy quickly when using Notebook.
Archiving and Logging Your Use of Public Data. Dealing with the impermanence of public data sets.
A curated list of tools that can improve your workflow as a web developer. Depending on what you do, there are often a few tools that you use from time to time that help you in your work. Some of them can even have ...
If you are a Data Scientist looking to make it to the next level, there are many opportunities to up your game and your efficiency to stand out from the others. Some of these recommendations are straightforward, and others are rarely followed, but they will all pay dividends of time and effectiveness for your career.
In this article, I will show how a programming language can be used as a DSL by a “workflow as code” engine, and why it’s probably your best long-term option for building reliable automation at any scale.
With Pachyderm and Actions, the MLOps process can be automated. Engineers and data scientists can write their machine learning code, and it is seamlessly and automatically deployed to a production-scale data workflow.
When you start working on large scale enterprise systems, handling releases can become complex. You'll have to think about your front-end, microservices, third-party services, and other services. Making sure these things get deployed in the right order and pass integration tests can be tricky once you start working with asynchronous tasks.
In this article, we will learn about KNIME and discuss how to use this tool for building a machine learning model from scratch.
Learn the most popular git commands and a simple but effective branching model. Software developers usually write tons of code every single day. They might be working on a new project or tweaking an existing one. But after some time, the codebase may grow so large that it becomes difficult to manage or track changes.
Orchestrate parallel jobs on K8s with a container-native workflow engine. Argo Workflows is an open-source container-native workflow engine for orchestrating parallel jobs on K8s. Argo Workflows is implemented as a K8s CRD (Custom Resource Definition).
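Because workflows are a CRD, you submit them as ordinary Kubernetes manifests. A minimal sketch of a workflow with two parallel steps (names and image are illustrative, not from the article):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: parallel-demo-
spec:
  entrypoint: main
  templates:
  - name: main
    steps:
    - - name: task-a        # steps in the same inner list run in parallel
        template: say-hello
      - name: task-b
        template: say-hello
  - name: say-hello
    container:
      image: docker/whalesay
      command: [cowsay, "hello"]
```

Submitted with `argo submit` or `kubectl create`, both `task-a` and `task-b` run as pods at the same time; an outer list entry only starts after the previous one finishes.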
With over 150 nodes, n8n saves countless hours by automating repetitive tasks. But what happens when n8n doesn't have a node for a tool you love? In this article, we are going to explore three examples that showcase how the HTTP Request node can be used in your workflow to automate tasks.
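Under the hood, the HTTP Request node is just issuing a plain HTTP call with your method, headers, and body. As a rough stdlib sketch of the same call outside n8n (the URL, token, and payload here are placeholders, not a real API):

```python
import json
import urllib.request

def build_api_request(url, token, payload):
    """Build a JSON POST request like the one an HTTP Request node would send.
    `url` and `token` stand in for your service's endpoint and credentials."""
    data = json.dumps(payload).encode("utf-8")
    return urllib.request.Request(
        url,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Sending it would be: urllib.request.urlopen(req)
req = build_api_request("https://api.example.com/tasks", "TOKEN", {"name": "demo"})
```

Anything you can express this way (method, URL, headers, body) can be configured in the node, which is why it covers tools without a dedicated integration.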
Build an Airflow data pipeline to monitor errors and send alert emails automatically. The story provides detailed steps with screenshots.
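The core of such a pipeline is simple: scan logs for error lines, then compose and send an alert email. A stdlib sketch of that logic (the `ERROR` marker and the addresses are assumptions, and in Airflow these functions would be wired into DAG tasks rather than called directly):

```python
import smtplib
from email.message import EmailMessage

def find_errors(log_lines):
    """Collect lines that look like errors; the 'ERROR' marker is an
    assumption about the log format."""
    return [line for line in log_lines if "ERROR" in line]

def build_alert(errors, sender="alerts@example.com", recipient="oncall@example.com"):
    """Compose the alert email (addresses are placeholders)."""
    msg = EmailMessage()
    msg["Subject"] = f"{len(errors)} pipeline error(s) detected"
    msg["From"] = sender
    msg["To"] = recipient
    msg.set_content("\n".join(errors))
    return msg

# Sending would be:
# with smtplib.SMTP("localhost") as s:
#     s.send_message(build_alert(errors))
```

Scheduling this on Airflow adds retries, history, and a UI on top of the same two steps.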
A beginner’s guide to the basic concepts of Apache Airflow. This is a memo to share what I have learnt in Apache Airflow, capturing the learning objectives as well as my personal notes. The course is taught by Mike Metzger from DataCamp, and it includes 4 chapters.
Setting up and creating your first workflow. Airflow was born out of Airbnb's problem of dealing with large amounts of data being used in a variety of jobs. To speed up the end-to-end process, Airflow was created to quickly author, iterate on, and monitor batch data pipelines. Airflow later joined Apache.
Once there were two sons of two entrepreneurs. They both decided to follow their fathers’ examples and build businesses out of ideas and relationships.
What’s New in Bayesnote v0.0.1. Bayesnote is a frictionless integrated notebook environment for data scientists and data engineers.
Argo Workflows is an open-source container-native workflow engine for orchestrating parallel jobs on K8s. Argo Workflows is implemented as a K8s CRD (Custom Resource Definition).