Castore DeRose

Creating an Audio Player in React Native

React Native development revolves around some common interface patterns that you should practice. One UI commonly built in mobile applications is an audio player.

In this tutorial, you are going to build a functioning interface for an audio player with common functionality such as:

  • Loading the audio file;
  • Playing/pausing the audio file;
  • Navigating to the next track;
  • Navigating to the previous track.

Apart from building the user interface, you are also going to learn a lot about using the expo-av module. This module provides an API that any Expo application can consume for media playback. It contains APIs for both audio and video media, but here we are only going to look at the audio portion.
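
Both APIs live in the same package. As a quick illustration (the tutorial itself only needs Audio, which is imported in a later section), the imports would look like this:

// expo-av exposes both the Audio API and the Video component from one package
import { Audio, Video } from 'expo-av'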

You will find the complete code for this tutorial at this GitHub repository.

What Are We Building?

The end result of this React Native tutorial is an audio player that can play tracks from remote audio files. For the demonstration, the app is going to use recordings of a play by William Shakespeare from LibriVox. All of these audio files are in the public domain, so you do not have to worry about copyright issues.

Requirements

To follow this tutorial, please make sure you have the following installed on your local development environment and have access to the services mentioned below:

  • Node.js (>=10.x.x) with npm/yarn installed.
  • expo-cli (>= 3.x.x), previously known as create-react-native-app.
  • Mac users must be running an iOS simulator.
  • Windows/Linux users must be running an Android emulator.

To learn more about how to set up and run the simulator or the emulator on your local development environment, visit React Native's official documentation here.

Getting Started

To start, you first have to initialize a new React Native project using the expo-cli tool. The only requirement right now is to have expo-cli installed. Then, generate a new project, navigate into its folder, and install the required dependency to add the functionality of playing an audio file inside the React Native app.


expo init music-player-expo
 
# navigate inside the app folder
cd music-player-expo
 
# install the following dependency
npm install expo-av

The dependency expo-av will help you use the Audio API and its promise-based asynchronous methods to play the audio files within the React Native app. The source of these audio files can be local or remote.
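
To make the local/remote distinction concrete, here is a minimal sketch of loading each kind of source with the Audio API (it assumes the Audio import covered later in this tutorial; the remote URL and the local path ./assets/sample.mp3 are placeholders, not files from this project):

// Sketch only, inside an async function: load a sound from a remote URI
const remotePlayback = new Audio.Sound()
await remotePlayback.loadAsync({ uri: 'https://example.com/track.mp3' })

// ...or from a local asset bundled with the app (hypothetical file path)
const localPlayback = new Audio.Sound()
await localPlayback.loadAsync(require('./assets/sample.mp3'))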

Once you have generated the app and installed the dependency, execute the command below to open the boilerplate application that comes with expo-cli.

expo start

The default boilerplate screen from the Expo template will welcome you in the simulator or emulator.

Since the app will be consuming several audio files from a remote resource, it is better to create an array that contains the details of each audio file, including its resource URI. Open App.js and add the following array before the App component.


import React from 'react'
import { StyleSheet, Text, View } from 'react-native'
 
const audioBookPlaylist = [
  {
    title: 'Hamlet - Act I',
    author: 'William Shakespeare',
    source: 'Librivox',
    uri:
      'https://ia800204.us.archive.org/11/items/hamlet_0911_librivox/hamlet_act1_shakespeare.mp3',
    imageSource: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg'
  },
  {
    title: 'Hamlet - Act II',
    author: 'William Shakespeare',
    source: 'Librivox',
    uri:
      'https://ia600204.us.archive.org/11/items/hamlet_0911_librivox/hamlet_act2_shakespeare.mp3',
    imageSource: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg'
  },
  {
    title: 'Hamlet - Act III',
    author: 'William Shakespeare',
    source: 'Librivox',
    uri: 'http://www.archive.org/download/hamlet_0911_librivox/hamlet_act3_shakespeare.mp3',
    imageSource: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg'
  },
  {
    title: 'Hamlet - Act IV',
    author: 'William Shakespeare',
    source: 'Librivox',
    uri:
      'https://ia800204.us.archive.org/11/items/hamlet_0911_librivox/hamlet_act4_shakespeare.mp3',
    imageSource: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg'
  },
  {
    title: 'Hamlet - Act V',
    author: 'William Shakespeare',
    source: 'Librivox',
    uri:
      'https://ia600204.us.archive.org/11/items/hamlet_0911_librivox/hamlet_act5_shakespeare.mp3',
    imageSource: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg'
  }
]
 
export default function App() {
  return (
    <View style={styles.container}>
      <Text>Open up App.js to start working on your app!</Text>
    </View>
  )
}
 
const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center'
  }
})


In the above snippet, imageSource is going to provide an album or audiobook cover.
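
As a side note, once the component has state (see the next section), you could make the cover follow the selected track by reading imageSource from the playlist entry. The render method later in this tutorial keeps a single hard-coded cover URI, so this is only an optional sketch:

// Sketch: display the cover of whichever track is currently selected
<Image
  style={styles.albumCover}
  source={{ uri: audioBookPlaylist[this.state.currentIndex].imageSource }}
/>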

Define an initial state in the App Component

In this section, you are going to convert the functional App component that comes with the default Expo app into a class component. This conversion makes it possible to define an initial state that will hold an object with properties like:

  • isPlaying to check whether the audio player is currently playing the audio file. This is going to be a boolean value.
  • playbackInstance to hold the instance of the current track being played.
  • volume to hold the current volume of the audio playback.
  • currentIndex to track the index of the track currently being played. This helps in navigating to and playing the next and the previous track from the audioBookPlaylist array.
  • isBuffering to hold a boolean value indicating whether the current media is being buffered.

The initial state of the App component is going to look like the below snippet. Open App.js to add the state.


export default class App extends React.Component {
  state = {
    isPlaying: false,
    playbackInstance: null,
    currentIndex: 0,
    volume: 1.0,
    isBuffering: false
  }
  render() {
    return (
      <View style={styles.container}>
        <Text>Open up App.js to start working on your app!</Text>
      </View>
    )
  }
}

Building the UI: Audio Player Controls

In this section, let us build the UI components for the basic audio player. To start, make sure that you are importing React Native elements like TouchableOpacity and Image from the core. Also, to add icons, import Ionicons from the @expo/vector-icons library (https://github.com/expo/vector-icons).

This package comes with the Expo app, so you do not have to install it as a separate module. This demo is going to use Ionicons from this package, but feel free to use another icon library.


import { StyleSheet, Text, TouchableOpacity, View, Image } from 'react-native'
import { Ionicons } from '@expo/vector-icons'

The next step is to modify the render function inside App.js. Inside the container view, you are going to add an image that displays the cover of the audiobook from the resource. Beneath this cover image, there will be three buttons that let you control the audio files within the app.


<View style={styles.container}>
  <Image
    style={styles.albumCover}
    source={{ uri: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg' }}
  />
  <View style={styles.controls}>
    <TouchableOpacity style={styles.control} onPress={() => alert('')}>
      <Ionicons name='ios-skip-backward' size={48} color='#444' />
    </TouchableOpacity>
    <TouchableOpacity style={styles.control} onPress={() => alert('')}>
      {this.state.isPlaying ? (
        <Ionicons name='ios-pause' size={48} color='#444' />
      ) : (
        <Ionicons name='ios-play-circle' size={48} color='#444' />
      )}
    </TouchableOpacity>
    <TouchableOpacity style={styles.control} onPress={() => alert('')}>
      <Ionicons name='ios-skip-forward' size={48} color='#444' />
    </TouchableOpacity>
  </View>
</View>


The conditional rendering applied to the second button means that whenever the boolean value of isPlaying changes to true, the UI will display a pause button instead of a play button. Each button wraps an icon.

All of these buttons are placed inside another view with specific styling, as you can see in the above snippet. Outside the class component, add the styling using a StyleSheet object.


const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center'
  },
  albumCover: {
    width: 250,
    height: 250
  },
  controls: {
    flexDirection: 'row'
  },
  control: {
    margin: 20
  }
})

There are no classes or IDs in React Native to style components with, like there are in web development. To create a new style object, you use the StyleSheet.create() method. Instead of a new style object being created every time the component renders, StyleSheet creates style objects that are referenced by ID, so the same styles are not rebuilt on every render.
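
To see the difference, compare a registered style with an inline style object; both produce the same view, but the inline object literal is recreated on every render (a sketch for illustration only):

// Style registered once via StyleSheet.create and referenced by name
const registered = <View style={styles.container} />

// Inline style: a new object literal is created each time the component renders
const inline = <View style={{ flex: 1, backgroundColor: '#fff' }} />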

Execute the command expo start from a terminal window, if you haven't already, and you will see the cover image and the three control buttons in the running app.

Exploring the Audio API

To play a sound in an Expo application, you need to import the Audio class from expo-av. At the top of the App.js file, after the other imports, add the following line.

import { Audio } from 'expo-av'


To customize the audio experience inside an iOS or an Android app, Expo provides an asynchronous method called setAudioModeAsync(). This method takes an options object as its only parameter. This object contains the key-value pairs required to enable and configure audio playback.

Inside the App component, you are going to add a lifecycle method componentDidMount(). This method should be defined after the initial state. It will help you configure the Audio component from the expo-av module.


async componentDidMount() {
  try {
    await Audio.setAudioModeAsync({
      allowsRecordingIOS: false,
      interruptionModeIOS: Audio.INTERRUPTION_MODE_IOS_DO_NOT_MIX,
      playsInSilentModeIOS: true,
      interruptionModeAndroid: Audio.INTERRUPTION_MODE_ANDROID_DUCK_OTHERS,
      shouldDuckAndroid: true,
      staysActiveInBackground: true,
      playThroughEarpieceAndroid: true
    })

    this.loadAudio()
  } catch (e) {
    console.log(e)
  }
}

Let us go through all the options being passed to the setAudioModeAsync method. These options define how the audio player is going to behave.

allowsRecordingIOS is a boolean which, when enabled, allows recording on iOS devices. playsInSilentModeIOS indicates whether the audiobook app should play while the device is in silent mode.

The interruptionModeIOS and interruptionModeAndroid options define how the audio of this app behaves alongside the audio of other apps. For example, what if you receive a call while listening to the audio player? How should the audio from the audiobook app behave? The value of these two options sets that. Currently, the option for iOS devices is set so that the app's audio is interrupted by the audio of other apps, hence INTERRUPTION_MODE_IOS_DO_NOT_MIX.

In the case of Android, however, the value INTERRUPTION_MODE_ANDROID_DUCK_OTHERS indicates that the volume of the audio from other apps will be lowered while the audiobook app is playing. Lowering the volume of other audio like this is known as ducking. To enable it on Android, you also have to set the value of shouldDuckAndroid to true.

Lastly, the lifecycle method is going to trigger the loadAudio function, which you are going to see in action in the next section.

Loading the Audio File

After the lifecycle method componentDidMount() inside the App.js file, you are going to add another asynchronous function called loadAudio(). This function will handle loading the audio file for the app's player.


async loadAudio() {
  const { currentIndex, isPlaying, volume } = this.state

  try {
    const playbackInstance = new Audio.Sound()
    const source = {
      uri: audioBookPlaylist[currentIndex].uri
    }

    const status = {
      shouldPlay: isPlaying,
      volume
    }

    playbackInstance.setOnPlaybackStatusUpdate(this.onPlaybackStatusUpdate)
    await playbackInstance.loadAsync(source, status, false)
    this.setState({ playbackInstance })
  } catch (e) {
    console.log(e)
  }
}

onPlaybackStatusUpdate = status => {
  this.setState({
    isBuffering: status.isBuffering
  })
}

The new Audio.Sound() call creates an instance that takes the source of the audio file (which can be either a local asset file or a remote URI, as in the current scenario). Using the currentIndex property from the state, the Audio instance looks up the corresponding entry in the audioBookPlaylist array to read the source URI and play the audio file.

On the Audio instance, a method called setOnPlaybackStatusUpdate is used. It is passed a handler function, onPlaybackStatusUpdate, which is responsible for updating the UI depending on whether the media is currently being buffered or played. To track the buffering state, the isBuffering property from the initial state is used; it gets updated whenever the status of the Audio instance changes.
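
The status object passed to this handler carries more than the buffering flag. If you also want to keep isPlaying in sync or react when a track finishes, a fuller handler could look like the sketch below (the tutorial's handler above only tracks isBuffering):

// Sketch of a richer status handler
onPlaybackStatusUpdate = status => {
  if (!status.isLoaded) return

  this.setState({
    isBuffering: status.isBuffering,
    isPlaying: status.isPlaying
  })

  if (status.didJustFinish) {
    // e.g. advance to the next track here
  }
}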

Lastly, the loadAsync function is called on the Audio instance with three parameters. The first parameter is the source of the audio file. The second parameter is the initial status object, which uses the shouldPlay and volume properties; the value of shouldPlay comes from isPlaying in the initial state. The last boolean value passed to loadAsync indicates whether the audio file should be downloaded before playing. In the current scenario, there is no requirement for that, so it has been set to false.

Control Handlers

Building on the previous section, let us add three new methods that control whether the audio instance is playing or paused. Changing to the next or the previous track is handled by separate handler functions as well. These handler functions are then used in the onPress props of the buttons created in the UI section.


handlePlayPause = async () => {
  const { isPlaying, playbackInstance } = this.state
  isPlaying ? await playbackInstance.pauseAsync() : await playbackInstance.playAsync()

  this.setState({
    isPlaying: !isPlaying
  })
}

handlePreviousTrack = async () => {
  let { playbackInstance, currentIndex } = this.state
  if (playbackInstance) {
    await playbackInstance.unloadAsync()
    currentIndex > 0 ? (currentIndex -= 1) : (currentIndex = audioBookPlaylist.length - 1)
    this.setState({
      currentIndex
    })
    this.loadAudio()
  }
}

handleNextTrack = async () => {
  let { playbackInstance, currentIndex } = this.state
  if (playbackInstance) {
    await playbackInstance.unloadAsync()
    currentIndex < audioBookPlaylist.length - 1 ? (currentIndex += 1) : (currentIndex = 0)
    this.setState({
      currentIndex
    })
    this.loadAudio()
  }
}

handlePlayPause checks the value of isPlaying to decide whether to pause or play the audio file that is currently loaded. This decision is made using a conditional operator, and then the state is updated accordingly. playbackInstance holds the same value from the previous section, where the audio file is loaded.

The next handler function, handlePreviousTrack, skips back to the previous audio track in the playlist. It first unloads the currently playing track using unloadAsync from the Audio API and then decrements the currentIndex value in the state, wrapping around to the last track when you are already at the first one. Similarly, the handler function handleNextTrack unloads the current track and then uses currentIndex to navigate to the next track, wrapping back to the first one at the end of the playlist.
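
If you prefer, the wrap-around logic inside both handlers can also be written with modular arithmetic instead of conditionals; this is just an equivalent sketch, not a change the tutorial requires:

// Equivalent index math for the next/previous handlers
const nextIndex = (currentIndex + 1) % audioBookPlaylist.length
const previousIndex = (currentIndex - 1 + audioBookPlaylist.length) % audioBookPlaylist.length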

Completing the Player UI

The last piece of the puzzle in this audio player app is to display the information about the audio file being played. This information is already provided in the mock API array audioBookPlaylist. Create a new function called renderFileInfo before the render function with the following JSX. Also, update the StyleSheet object.


renderFileInfo() {
  const { playbackInstance, currentIndex } = this.state
  return playbackInstance ? (
    <View style={styles.trackInfo}>
      <Text style={[styles.trackInfoText, styles.largeText]}>
        {audioBookPlaylist[currentIndex].title}
      </Text>
      <Text style={[styles.trackInfoText, styles.smallText]}>
        {audioBookPlaylist[currentIndex].author}
      </Text>
      <Text style={[styles.trackInfoText, styles.smallText]}>
        {audioBookPlaylist[currentIndex].source}
      </Text>
    </View>
  ) : null
}

// update the StyleSheet object
const styles = StyleSheet.create({
  container: {
    flex: 1,
    backgroundColor: '#fff',
    alignItems: 'center',
    justifyContent: 'center'
  },
  albumCover: {
    width: 250,
    height: 250
  },
  trackInfo: {
    padding: 40,
    backgroundColor: '#fff'
  },
  trackInfoText: {
    textAlign: 'center',
    flexWrap: 'wrap',
    color: '#550088'
  },
  largeText: {
    fontSize: 22
  },
  smallText: {
    fontSize: 16
  },
  control: {
    margin: 20
  },
  controls: {
    flexDirection: 'row'
  }
})

Next, use this function inside the render method of the App component, below the view that holds all the control buttons. Also, update the control buttons to use the appropriate handler functions from the previous section. Here is the complete code of the render function.

render() {
  return (
    <View style={styles.container}>
      <Image
        style={styles.albumCover}
        source={{ uri: 'http://www.archive.org/download/LibrivoxCdCoverArt8/hamlet_1104.jpg' }}
      />
      <View style={styles.controls}>
        <TouchableOpacity style={styles.control} onPress={this.handlePreviousTrack}>
          <Ionicons name='ios-skip-backward' size={48} color='#444' />
        </TouchableOpacity>
        <TouchableOpacity style={styles.control} onPress={this.handlePlayPause}>
          {this.state.isPlaying ? (
            <Ionicons name='ios-pause' size={48} color='#444' />
          ) : (
            <Ionicons name='ios-play-circle' size={48} color='#444' />
          )}
        </TouchableOpacity>
        <TouchableOpacity style={styles.control} onPress={this.handleNextTrack}>
          <Ionicons name='ios-skip-forward' size={48} color='#444' />
        </TouchableOpacity>
      </View>
      {this.renderFileInfo()}
    </View>
  )
}

Now, run the application and you will see the complete audio player, with the cover image, the control buttons, and the track information.

Conclusion

You have reached the end of this tutorial. We hope you enjoyed it and learned how to integrate the expo-av library and its Audio class into a cross-platform application to build an audio player. An important thing to take away from this demo application is how to use methods like loadAsync() and unloadAsync().

I hope this tutorial helps you. If you liked it, please consider sharing it with others.

Originally published on blog.jscrambler.com

#react-native #reactjs #javascript #web-development


