I’ve recently written a small, AI-powered to-do app. The app is a simple list of to-do items that you can add to and delete from — your standard to-do list app with one difference: it recognises the user’s input text and automatically appends a relevant image next to it.

As an example, if the user adds a to-do item, “buy bananas,” the app will classify the word “bananas,” find and download an image of bananas, and append it next to the to-do item. It looks like a bit of magic, but it’s a pretty simple thing to do using Apple’s native Natural Language framework. (A reference to the open-source app repository is at the bottom of this article.)

Want to know how I did it? Read on.

Using Apple’s native Natural Language framework (available in Swift and Objective-C), you can extract a text’s lexical classes (nouns, verbs, pronouns, adjectives, and so on), detect the language of the input text, and even obtain a sentiment score indicating how positive or negative the text is.
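To make those three capabilities concrete, here is a minimal sketch using the framework’s `NLTagger` and `NLLanguageRecognizer` types. The sample sentence is my own; the printed tags can vary slightly between OS versions, since the underlying models are updated by Apple.

```swift
import NaturalLanguage

let text = "I really love fresh bananas"

// 1. Detect the dominant language of the input text.
let recognizer = NLLanguageRecognizer()
recognizer.processString(text)
print(recognizer.dominantLanguage?.rawValue ?? "unknown") // e.g. "en"

// 2. Tag each word with its lexical class (noun, verb, adjective, …).
let tagger = NLTagger(tagSchemes: [.lexicalClass, .sentimentScore])
tagger.string = text
tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                     unit: .word,
                     scheme: .lexicalClass,
                     options: [.omitPunctuation, .omitWhitespace]) { tag, range in
    if let tag = tag {
        print("\(text[range]): \(tag.rawValue)")
    }
    return true // keep enumerating
}

// 3. Sentiment score for the sentence, as a string between "-1.0" and "1.0".
//    The sentiment scheme works at the paragraph/sentence unit, not per word.
let (sentiment, _) = tagger.tag(at: text.startIndex,
                                unit: .paragraph,
                                scheme: .sentimentScore)
print(sentiment?.rawValue ?? "0")
```

Note that a single `NLTagger` can be configured with several tag schemes at once, which is why `.lexicalClass` and `.sentimentScore` are both passed to the initialiser.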

Now, my example is a little to-do app, but you can easily use the framework in many more ways to enrich the user experience in your own app. For instance, how helpful would it be to know a user’s sentiment score based on their input text? You could customise the way your app responds depending on whether the user is feeling positive or negative.
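As a sketch of that idea, the helper below wraps the sentiment scheme in a small function and branches on the result. The function name and the canned responses are my own illustration, not part of the framework:

```swift
import NaturalLanguage

// Returns a sentiment score in -1.0...1.0 (0 when no score is available).
func sentimentScore(for text: String) -> Double {
    let tagger = NLTagger(tagSchemes: [.sentimentScore])
    tagger.string = text
    let (tag, _) = tagger.tag(at: text.startIndex,
                              unit: .paragraph,
                              scheme: .sentimentScore)
    return Double(tag?.rawValue ?? "0") ?? 0
}

let score = sentimentScore(for: "I had a wonderful day!")
if score > 0 {
    print("Glad to hear it!")      // upbeat response for positive input
} else if score < 0 {
    print("Sorry to hear that.")   // gentler response for negative input
}
```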

Alternatively, you could do something similar to what I’ve done in my little app: use the nouns from the input as a search term for a remote image-search API, then display the fetched image next to the user’s to-do item.
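The noun-extraction half of that pipeline could look like the sketch below. The image-search call itself is left out, since it depends entirely on which API you use; `nouns(in:)` is my own helper name:

```swift
import NaturalLanguage

// Collect the nouns in a piece of text, to be used as an image-search query.
func nouns(in text: String) -> [String] {
    let tagger = NLTagger(tagSchemes: [.lexicalClass])
    tagger.string = text
    var result: [String] = []
    tagger.enumerateTags(in: text.startIndex..<text.endIndex,
                         unit: .word,
                         scheme: .lexicalClass,
                         options: [.omitPunctuation, .omitWhitespace]) { tag, range in
        if tag == .noun {
            result.append(String(text[range]))
        }
        return true
    }
    return result
}

// For "buy bananas", the tagger should pick out "bananas" as the noun —
// exactly the word you'd pass to an image-search API.
let query = nouns(in: "buy bananas").joined(separator: " ")
```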

Code speaks louder than an essay about it, so let’s have a look at two examples.


How to Analyse Language Using ‘NLTagger’ in Swift