Using Kaggle Datasets with Google Colab

We are all aware that Google keeps delivering excellent solutions to most of our problems, and the Google Colab notebook is one of them. These notebooks run on Google’s cloud servers and offer both GPU and TPU runtime environments, all free of charge [ thanks, Google :) ]. Anyone interested in ML, DL, or AI should definitely try out Colab notebooks.

Let’s get started

Sign up for a Kaggle account before diving in. Log in, click your profile icon in the top-right corner, and select **My Account** from the dropdown menu.

Scroll down to the API section and click the **“Create New API Token”** option to download a kaggle.json file. If you open this file in a text editor you will see two fields, **“username”** and **“key”**. This is the file we will add to our Colab notebook later on.

Moving on to the Colab notebook

Start a new notebook at **colab.research.google.com**. On the menu panel you will see a Runtime option where you can change the runtime type and decide between the standard, GPU-based, or TPU-based environment. By default, Colab runs on the standard environment.
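
After switching the runtime type, it is worth confirming that the accelerator is really attached. A minimal sketch, assuming Colab’s stock image (which ships with nvidia-smi and a preinstalled torch):

!nvidia-smi  # prints driver and GPU details if a GPU is attached

import torch
print(torch.cuda.is_available())  # True on a working GPU runtime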

Now it’s time to install Kaggle. The installation command is the same one you would use in a Jupyter notebook. Run the following:

!pip install kaggle

Now create a folder named .kaggle as shown below:

!mkdir .kaggle

Note that .kaggle will be a hidden directory; you can check that it was successfully created with the !ls -la command. Now run the following code, filling in the values from your kaggle.json file:

import json

# Paste the "username" and "key" values from your kaggle.json file here
token = {"username":"YOUR_USERNAME","key":"AUTHENTICATION_KEY"}

# Write the credentials into the hidden .kaggle directory created above
with open('/content/.kaggle/kaggle.json','w') as file:
    json.dump(token,file)
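
The Kaggle command-line tool looks for credentials in ~/.kaggle/kaggle.json and warns if the file is readable by other users, so the remaining steps are to copy the token there, tighten its permissions, and download a dataset. The slug owner/dataset-name below is a placeholder, not a real dataset; a minimal sketch:

!mkdir -p ~/.kaggle
!cp /content/.kaggle/kaggle.json ~/.kaggle/
!chmod 600 ~/.kaggle/kaggle.json

# Replace owner/dataset-name with the slug from the dataset's Kaggle URL;
# the download arrives as dataset-name.zip in the current directory
!kaggle datasets download -d owner/dataset-name
!unzip -q dataset-name.zip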

#kaggle #machine-learning #deep-learning #google-colab

Google’s TPUs Are Being Primed for the Quantum Jump

The liquid-cooled Tensor Processing Units, built to slot into server racks, can deliver up to 100 petaflops of compute.

As the world gears towards more automation and AI, the need for quantum computing has grown exponentially. Quantum computing lies at the intersection of quantum physics and high-end computer technology and, in more than one way, holds the key to our AI-driven future.

Quantum computing requires state-of-the-art tools for high-end computation, and this is where TPUs come in. TPUs, or Tensor Processing Units, are custom-built ASICs (Application-Specific Integrated Circuits) that execute machine learning tasks efficiently. They are hardware developed by Google specifically for neural network machine learning, customised for Google’s machine learning framework, TensorFlow.

These liquid-cooled units power Google products like Google Search, Gmail, Google Photos and the Google Cloud AI APIs.

#opinions #alphabet #asics #floq #google #google alphabet #google quantum computing #google tensorflow #google tensorflow quantum #google tpu #google tpus #machine learning #quantum computer #quantum computing #quantum computing programming #quantum leap #sandbox #secret development #tensorflow #tpu #tpus

Embedding your <image> in Google Colab <markdown>

This article is a quick guide to help you embed images in Google Colab markdown without mounting your Google Drive!

Just a quick intro to Google Colab

Google Colab is a cloud service that offers free Python notebook environments to developers and learners, along with free GPU and TPU access. Users can write and execute Python code in the browser without any pre-configuration. Colab offers two types of cells: text and code. Code cells act like a code editor; coding and execution are done in these blocks. Text cells are used to embed textual descriptions and explanations alongside the code, formatted with a simple markup language called markdown.

Embedding Images in markdown

If you are a regular Colab user like me, using markdown to add detail to your code will be a habit for you too! While working in Colab, I tried to embed images along with text in markdown, but it took me almost an hour to figure out how to do it. So here is an easy guide to help you.

Step 1:

The first step is to get the images into your Google Drive: upload every image you want to embed in markdown.

Step 2:

Google Drive gives you the option to share an image via a shareable link. Right-click your image and you will find the option to get one.

On selecting ‘Get shareable link’, Google will create and display a shareable link for that particular image.
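
That link cannot be embedded as-is; the usual trick is to copy the file id out of it and point markdown’s image syntax at Drive’s direct-download endpoint. A minimal sketch, where FILE_ID is a placeholder for the id that appears in your link (https://drive.google.com/file/d/FILE_ID/view?usp=sharing):

![my image](https://drive.google.com/uc?id=FILE_ID)

Paste that line into a Colab text cell and the image should render without mounting your drive.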

#google-cloud-platform #google-collaboratory #google-colaboratory #google-cloud #google-colab #cloud

Inside ABCD, A Dataset To Build In-Depth Task-Oriented Dialogue Systems

According to a recent study, call centre agents spend approximately 82 percent of their total time looking at step-by-step guides, customer data, and knowledge base articles.

Traditionally, dialogue state tracking (DST) has served as a way to determine what a caller wants at a given point in a conversation. DST is the core part of a spoken dialogue system: it estimates the user’s possible goals at every dialogue turn. Unfortunately, the agent workflows described above are not accounted for in popular DST benchmarks.

To reduce the burden on call centre agents and improve the state of the art (SOTA) in task-oriented dialogue systems, AI-powered customer service company ASAPP recently released the Action-Based Conversations Dataset (ABCD). The dataset is designed to help develop task-oriented dialogue systems for customer service applications. ABCD is a fully labelled dataset of over 10,000 human dialogues covering 55 distinct user intents, each requiring sequences of actions constrained by company policies to accomplish the task.

https://twitter.com/asapp/status/1397928363923177472

The dataset is currently available on GitHub.
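
If you want to poke at the data, releases like this usually ship as gzipped JSON. A minimal sketch, assuming a local copy named abcd_v1.1.json.gz keyed by split (both the file name and the structure are assumptions about the GitHub release):

import gzip
import json

# Hypothetical local copy of the ABCD release
with gzip.open('abcd_v1.1.json.gz', 'rt') as f:
    data = json.load(f)

# Assuming the JSON is keyed by train/dev/test splits, count dialogues per split
for split, dialogues in data.items():
    print(split, len(dialogues))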

#developers corner #asapp abcd dataset #asapp new dataset #build enterprise chatbot #chatbot datasets latest #customer support datasets #customer support model training #dataset for chatbots #dataset for customer datasets

Get Google Trends using Python

In this post, we will show how to use Python to get data from Google Trends. Let’s have a look at the top trending searches for today in the US (14th of March, 2020). As we can see, the top search is about coronavirus tips, with more than 2M searches, and in 7th position is Rick Pitino, with around 100K searches.

Python package for getting Google Trends data

We will use the pytrends package, an unofficial API for Google Trends that provides a simple interface for automating the download of reports from Google Trends. Its main feature is logging in to Google on your behalf to enable a higher rate limit. I should mention that the package did not work for me out of the box, and I had to create a new Anaconda environment with pandas 0.25 installed.

You can install the pytrends package with pip:

pip install pytrends
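
Once installed, pulling today’s trending searches takes only a few lines. A minimal sketch using pytrends’ TrendReq client and its trending_searches call (the hl, tz, and pn values are typical examples, not requirements):

from pytrends.request import TrendReq

# Connect to Google Trends (hl = host language, tz = timezone offset in minutes)
pytrends = TrendReq(hl='en-US', tz=360)

# Today's trending searches for the US, returned as a pandas DataFrame
trending = pytrends.trending_searches(pn='united_states')
print(trending.head(10))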

#google-trends #how-to-use-google-trend #google #google-api #python