JavaScript Functional Programming Explained: Fusion & Transduction

Fusion and transduction are probably the most “blow-my-mind” practical tools I’ve picked up in my time studying Functional Programming.

They’re not tools I use every day, nor are they strictly ever necessary…But, they completely changed the way I think about programming, modularity, and abstraction in software engineering, permanently, and for the better.

And to be clear, that’s really the point of both this article in particular, and series in general: Not to evangelize FP, offer a silver bullet, or illuminate some magic secret sauce that’s “better” than what you do now. Rather, the point is to shed light on different ways of thinking about programming, and broaden your sense of possible solutions to everyday problems.

These aren’t easy techniques to use fluently, and it’ll probably take some time, tinkering, and deliberate practice to fully comprehend what’s going on here. This isn’t a read-and-grok concept: For most, it’s an entirely new level of abstraction.

But, if you put in the time, you might just come out with the sharpest sense of abstractions around functions that you’ve ever had.

In other words: This is going to make you a helluva lot smarter, if you put in the work.

If that sounds like your bag…Let’s get straight to it.

A Quick Example

Recall the definition of pure functions: functions that have no side effects, and always return the same value for any given input.
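
A quick illustration, with made-up functions:

// Pure: no side effects, and the same input always produces the same output
const double = n => n * 2;

// Impure: the result depends on (and mutates) state outside the function
let counter = 0;
const nextId = () => ++counter;

double(2); // always 4
nextId();  // 1 this time, 2 the next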

Since pure functions always return the same value for a given input, we can safely pass their return values directly to other functions.

This enables such niceties as:

// Compose the whole pipeline in a single expression
colorBackground(wrapWith(makeHeading(createTitle(movie)), 'div'), 'papayawhip')

Here, we use createTitle to build a string title out of movie; pass this string to makeHeading to create a new heading element (makeHeading delegates to document.createElement); wrap this heading in a div; and finally, call colorBackground, which updates the element’s styling to set a background of papayawhip…Which is my favorite flavor of CSS.

Let’s be explicit about the composition at work in this snippet.

At each step of the pipeline, a function accepts an input, and returns an output, which the input determines completely.

More formally: At each step, we add another referentially transparent function to the pipeline.

Yet more formally: the whole expression is a composition of referentially transparent functions; call it papayaWhipHeading.

It’s worth pointing out that a functional eye might spot the below possibility, as well.
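
Something along these lines, naming the whole pipeline as a single composition (a sketch assuming Ramda's compose and the same helpers as above):

const R = require('ramda');

// R.compose applies right-to-left: createTitle runs first, colorBackground last
const papayaWhipHeading = R.compose(
  heading => colorBackground(heading, 'papayawhip'),
  element => wrapWith(element, 'div'),
  makeHeading,
  createTitle
);

papayaWhipHeading(movie);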

…But you’re not here for illustrative-yet-contrived examples.

You’re here to learn about fusion.

And I’m here to talk about it.

So…Let’s dash through the rest of those prerequisites, and look at chaining Array methods, next.

Chaining map and filter expressions

One of the nicer features of map is that it automatically returns an array with its results.

const capitalized = ["where's", 'waldo'].map(function(word) {
  return word.toUpperCase();
});

console.log(capitalized); // ['WHERE\'S', 'WALDO']

Of course, there’s nothing particularly special about capitalized. It has all the same methods any other array does.

Since map and filter return arrays, we can chain calls to either method directly to their return values.

const screwVowels = function(word) {
  return word.replace(/[aeiuo]/gi, '');
};

// Calling map on the result of calling map

const capitalizedTermsWithoutVowels = ["where's", 'waldo']
  .map(word => word.toUpperCase())
  .map(screwVowels);

This isn’t a particularly dramatic result: Chained array methods like this are common in JS-land. But, it merits attention because it leads to code like the below:

// Retrieve a series of 'posts' from JSON Placeholder (for fake demonstration data)
// GET data
fetch('https://jsonplaceholder.typicode.com/posts')
  // Extract POST data from response
  .then(data => data.json())
  // This callback contains the code you should focus on--the above is boilerplate
  .then(data => {
    // filter for posts by user with userId == 1
    const sluglines = data
      .filter(post => post.userId == 1)
      // Extract only post and body properties
      .map(post => {
        const extracted = {
          body: post.body,
          title: post.title
        };

        return extracted;
      })
      // Truncate "body" to first 17 characters, and add 3-character ellipsis
      .map(extracted => {
        extracted.body = extracted.body.substring(0, 17) + '...';
        return extracted;
      })
      // Capitalize title
      .map(extracted => {
        extracted.title = extracted.title.toUpperCase();
        return extracted;
      })
      // Create sluglines
      .map(extracted => {
        return `${extracted.title}\n${extracted.body}`;
      });
  });

This is maybe a few more map calls than is common, sure…But, consider map alongside filter, and this style becomes a lot more believable.

Using simple, “single-purpose” callbacks in sequential calls to map and filter lets us write simpler code, at the cost of overhead due to function invocation and the requirement for “single-purpose” callbacks.

We also enjoy the benefit of immutability, as map and filter don’t modify the array you call them on. Rather, they create new arrays each time.

This lets us avoid confusion due to subtle side effects, and preserves the integrity of our initial data source, allowing us to pass it to multiple processing pipelines without issue.
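
A quick sanity check with toy data:

const numbers = [1, 2, 3];
const doubled = numbers.map(n => n * 2);

console.log(doubled); // [2, 4, 6]
console.log(numbers); // [1, 2, 3] (the original is untouched)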

Intermediate Arrays

On the other hand, allocating a whole new array on every invocation of map or filter seems…A little heavy-handed.

The sequence of calls we made above feels a bit “heavy-handed”, because we only care about the array we get after we make all of our calls to map and filter. The intermediate arrays we generate along the way are throwaways.

…Literally. We create them for the sole purpose of providing the next function in the chain with data in the format it expects. We only hang on to the last array we generate. The JavaScript engine eventually garbage collects the intermediate arrays that we built up but didn’t need.

If you’re using this style of programming to process large lists, this can lead to considerable memory overhead.

In other words: We’re trading memory and some incidental code complexity for testability and readability.

Eliminating Intermediate Arrays

For simplicity, let’s consider a sequence of calls to map.

// See bottom of snippet for `users` list
users
  // Extract important information...
  .map(function (user) {
      // Pull out only the fields we care about
      // (data shape: https://jsonplaceholder.typicode.com/users)
      const { name, username, email, website } = user
      return { name, username, email, website }
  })
  // Build string... 
  .map(function (reducedUserData) {
    // New object only has user's name, username, email, and website
    // Let's reformat this data for our component
    const { name, username, email, website } = reducedUserData
    const displayName = `${username} (${name})`
    const contact = `${website} (${email})`

    // Build the string we want to drop into our UserCard component
    return `${displayName}\n${contact}`
  })
  // Build components...
  .map(function (displayString) {
      return UserCardComponent(displayString)
  })

// Example data. Note: in runnable code, define this before the pipeline above (var hoisting lifts the declaration, not the value)
var users = [
    {
    "id": 1,
    "name": "Leanne Graham",
    "username": "Bret",
    "email": "[email protected]",
    "address": {
      "street": "Kulas Light",
      "suite": "Apt. 556",
      "city": "Gwenborough",
      "zipcode": "92998-3874",
      "geo": {
        "lat": "-37.3159",
        "lng": "81.1496"
      }
    },
    "phone": "1-770-736-8031 x56442",
    "website": "hildegard.org",
    "company": {
      "name": "Romaguera-Crona",
      "catchPhrase": "Multi-layered client-server neural-net",
      "bs": "harness real-time e-markets"
    }
  },
  {
    "id": 2,
    "name": "Ervin Howell",
    "username": "Antonette",
    "email": "[email protected]",
    "address": {
      "street": "Victor Plains",
      "suite": "Suite 879",
      "city": "Wisokyburgh",
      "zipcode": "90566-7771",
      "geo": {
        "lat": "-43.9509",
        "lng": "-34.4618"
      }
    }
  }
]

To restate the problem: This produces an intermediate, “throwaway” array with every invocation of map.

This implies that we can avoid allocating intermediate arrays if we find a way to execute all of our processing logic while invoking map only once.

One way to get away with a single call to map is to do all of our work inside of a single callback.

const userCards = users.map(function (user) {
    // Destructure user we're interested in...
    const { name, username, email, website } = user

    const displayName = `${username} (${name})`
    const contact = `${website} (${email})`

    // Create display string for our component...
    const displayString = `${displayName}\n${contact}`

    // Build/return the UserCardComponent
    return UserCardComponent(displayString)
})

This eliminates intermediate arrays…But, this is a step backwards. Throwing everything into a single callback loses the readability and testability benefits that motivated sequenced calls to map in the first place.

One way to improve the readability of this version is to extract the callbacks into their own functions, and use them within the call to map, rather than the literal function declarations.

const extractUserData = function (user) {
    const { name, username, email, website } = user
    return { name, username, email, website }
}

const buildDisplayString = function (userData) {
    const { name, username, email, website } = userData
    const displayName = `${username} (${name})`
    const contact = `${website} (${email})`

    return `${displayName}\n${contact}`
}

const userCards = users.map(function (user) {
    const adjustedUserData = extractUserData(user)
    const displayString = buildDisplayString(adjustedUserData)
    const userCard = UserCardComponent(displayString)

    return userCard
})

This is logically equivalent to what we started with, due to referential transparency (!). But, it’s definitely easier to read, and arguably easier to test.

The real victory here is that this version makes the structure of our processing logic much clearer: This should be ringing bells—sounds like function composition, doesn’t it?

We can go one step further. Instead of saving the result of each function invocation to a variable, we can simply pass the result of each call directly to the next function in the sequence.

So…How about:

const userCards = users.map(function (user) {
    const userCard = UserCardComponent(buildDisplayString(extractUserData(user)))
    return userCard
})

…And, because I’m vain and like my code pretty and terse:

const userCards = 
  users.map(user => UserCardComponent(buildDisplayString(extractUserData(user))))

Composition & Fusion

This restores all of the testability, and some of the readability, of our original chain of map calls.

And, since we’ve managed to express this transformation with only a single call to map, we’ve eliminated the memory overhead imposed by intermediate arrays.

We did this by converting our sequence of calls to map, each of which received a “single-purpose” callback, into a single call to map, in which we use a composition of those callbacks.

This process is called fusion, and allows us to avoid the overhead of intermediate arrays while enjoying the testability and readability benefits of sequenced calls to map.

Explicit is better than implicit. ~ The Zen of Python
One last improvement. Let’s take a cue from that other language, and be explicit about what we’re doing.

const R = require('ramda');

// Use composition to combine "single-purpose" callbacks into a single transformation function
const buildUserCard = R.compose(UserCardComponent, buildDisplayString, extractUserData)

// Generate our list of user components
const userCards = users.map(buildUserCard)

We can write a helper to make this even cleaner.

const R = require('ramda')

const fuse = (list, functions) => list.map(R.compose(...functions))

// Then...
const userCards = fuse(
    // list to transform
    users, 
    // functions to apply
    [UserCardComponent, buildDisplayString, extractUserData]
)

One thing to keep in mind: R.compose applies its functions right-to-left, so extractUserData runs first and UserCardComponent runs last.

It works with filter chains, too, though the fusion looks a little different there: rather than composing the predicates, you combine them with a logical AND so the whole chain still collapses into a single pass.
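
Here's a sketch with hypothetical predicates (the users data above doesn't carry these fields):

const isAdult = user => user.age >= 18;
const isActive = user => user.active;

// Two passes, one intermediate array...
const activeAdults = users.filter(isAdult).filter(isActive);

// ...fused into a single pass by AND-ing the predicates
const fuseFilters = (list, predicates) =>
  list.filter(element => predicates.every(predicate => predicate(element)));

const activeAdultsFused = fuseFilters(users, [isAdult, isActive]);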

It’s like having your cake and eating it, too.

…Except this cake improves your memory profile.

Some cake.

Meltdown

If you’re like me, this is the part where you start using map and filter just everywhere.

Even on stuff you probably shouldn’t use them for.

Just for the excuse to fuse everything in sight.

But, the high doesn’t last long with this one. Check this:

users
  // Today, I've decided I hate the letter a
  .filter(function (user) {
      return user.name[0].toLowerCase() != 'a'
  })
  .map(function (user) {
      const { name, email } = user
      return `${name}'s email address is: ${email}.`
  })

Fusion works fine with a sequence of map calls. It works just as well with a sequence of calls to filter.

Unfortunately, it breaks with sequential calls involving both methods. Fusion only works for sequenced calls to one of these methods.

That’s because they interpret the return values of their callbacks differently. map takes the return value and pushes it into an array, regardless of what it is.

filter, on the other hand, interprets the truthiness of the callback’s return value. If the callback returns true for an element, it keeps that element. Otherwise, it throws it out.

Fusion doesn’t work because there’s no way to tell the fused function which callbacks should be used as filters, and which should be used as simple transformations.

In other words: This approach to fusion only works in the (very) special case of a sequence of calls to map alone, or to filter alone.
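
To see the failure concretely, here's a sketch that naively composes the two callbacks from the chain above (the names are made up):

const dropNamesStartingWithA = user => user.name[0].toLowerCase() != 'a';

const describeUser = user => {
  const { name, email } = user;
  return `${name}'s email address is: ${email}.`;
};

// Composing them feeds filter's boolean straight into the mapping function...
const fused = user => describeUser(dropNamesStartingWithA(user));

// ...so nothing gets dropped, and every element turns to mush:
users.map(fused);
// -> ["undefined's email address is: undefined.", ...] for every user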

So…Game over?

Transduction

As we’ve seen, fusion only works for a series of calls involving only map, or only filter.

This isn’t very helpful in practice, where we’ll typically invoke both.

If you tear your hair out about it long enough, you’ll realize that there’s a trick. Recall that we were able to express map and filter in terms of reduce.

// Expressing `map` in terms of `reduce`
const map = (list, mapFunction) => {
    const output = list.reduce((transformedList, nextElement) => {
        // use the mapFunction to transform the nextElement in the list 
        const transformedElement = mapFunction(nextElement);

        // add transformedElement to our list of transformed elements
        transformedList.push(transformedElement);

        // return list of transformed elements
        return transformedList;
    }, [])
    // ^ start with an empty list

    return output;
}

// Expressing `filter` in terms of `reduce`
const filter = (list, predicate) => {
    const output = list.reduce(function (filteredElements, nextElement) {
        // only add `nextElement` if it passes our test
        if (predicate(nextElement)) {
            filteredElements.push(nextElement);
        }

        // return the list of filtered elements on each iteration
        return filteredElements;
    }, [])
    // ^ start with an empty list

    return output;
}

In theory, this means that we can replace our calls to map and then filter with calls to reduce. Then, we’d have a chain of calls involving only reduce, but which implements the same mapping/filtering logic we’re already using.

From there, we can apply a technique very similar to what we’ve seen with fusion to express our series of reductions in terms of a single function composition.

Step 1: mapReducer and filterReducer

The first step is to re-express our calls to map and filter in terms of reduce.

Previously, we wrote our own versions of map and filter in terms of reduce. Renamed to mapReducer and filterReducer, they looked like this:

const mapReducer = (list, mapFunction) => {
    const output = list.reduce((transformedList, nextElement) => {
        // use the mapFunction to transform the nextElement in the list 
        const transformedElement = mapFunction(nextElement);

        // add transformedElement to our list of transformed elements
        transformedList.push(transformedElement);

        // return list of transformed elements
        return transformedList;
    }, [])
    // ^ start with an empty list

    return output;
}

const filterReducer = (list, predicate) => {
    const output = list.reduce(function (filteredElements, nextElement) {
        // only add `nextElement` if it passes our test
        if (predicate(nextElement)) {
            filteredElements.push(nextElement);
        }

        // return the list of filtered elements on each iteration
        return filteredElements;
    }, [])
    // ^ start with an empty list

    return output;
}

We used these to demonstrate the relationship between reduce and map/filter.

But we need to make some changes if we want to use this in reduce chains.

Let’s start by removing those calls to reduce:

const mapReducer = mapFunction => (transformedList, nextElement) => {
    const transformedElement = mapFunction(nextElement);

    transformedList.push(transformedElement);

    return transformedList;
}

const filterReducer = predicate => (filteredElements, nextElement) => {
    if (predicate(nextElement)) {
        filteredElements.push(nextElement);
    }

    return filteredElements;
}

Earlier, we filtered and mapped an array of users. Let’s start rewriting that logic with these new functions to make all this a little less abstract.

// filter's predicate function
function removeNamesStartingWithA (user) {
    return user.name[0].toLowerCase() != 'a'
}

// map's transformation function
function createUserInfoString (user) {
    const { name, email } = user
    return `${name}'s email address is: ${email}.`
}

users
  .reduce(filterReducer(removeNamesStartingWithA), [])
  .reduce(mapReducer(createUserInfoString), [])

This produces the same result as our previous filter/map chain.

Recapping: There are quite a few layers of indirection involved here. Take some time to step through the above snippet before moving on.
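
For instance, with a tiny, made-up user list:

const demoUsers = [
  { name: 'Aurora', email: 'aurora@example.com' },
  { name: 'Bret', email: 'bret@example.com' }
];

const filtered = demoUsers.reduce(filterReducer(removeNamesStartingWithA), []);
// -> [{ name: 'Bret', email: 'bret@example.com' }]  ('Aurora' starts with 'a')

const result = filtered.reduce(mapReducer(createUserInfoString), []);
// -> ["Bret's email address is: bret@example.com."]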

Step 2: Generalizing Our Folding Function

Take another look at mapReducer and filterReducer.

const mapReducer = mapFunction => (transformedList, nextElement) => {
    const transformedElement = mapFunction(nextElement);

    transformedList.push(transformedElement);

    return transformedList;
}

const filterReducer = predicate => (filteredElements, nextElement) => {
    if (predicate(nextElement)) {
        filteredElements.push(nextElement);
    }

    return filteredElements;
}

Rather than hard-code transformation or predicate logic, we allow the user to pass in mapping and predicate functions as arguments, which the partial applications of mapReducer and filterReducer remember due to closure.

This way, we can use mapReducer and filterReducer as “backbones” in building arbitrary reduction chains by passing the predicate or mapFunction appropriate for our use case.
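
For instance, the same backbones handle an entirely different (hypothetical) pipeline without modification:

const keepLongWords = filterReducer(word => word.length > 5);
const shout = mapReducer(word => word.toUpperCase());

['fusion', 'map', 'transduction']
  .reduce(keepLongWords, []) // -> ['fusion', 'transduction']
  .reduce(shout, []);        // -> ['FUSION', 'TRANSDUCTION']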

If you look closely, you’ll notice that we still make explicit calls to push in both of these reducers. This is important, because push is the function that allows us to combine, or reduce, two objects into one:

// Object 1...
const accumulator = ["an old element"];

// Object 2...
const next_element = "a new element";

// A single object that combines both! Eureka!
accumulator.push(next_element);

// ["an old element", "a new element"]
console.log(accumulator)

Recall that combining elements like this is the whole point of using reduce in the first place.

If you think about it, push isn’t the only function we can use to do this. We could use unshift, instead:

// Object 1...
const accumulator = ["an old element"];

// Object 2...
const next_element = "a new element";

// A single object that combines both! Eureka!
accumulator.unshift(next_element);

// ["a new element", "an old element"]
console.log(accumulator);

As written, our reducers lock us into using push. If we wanted to unshift, instead, we’d have to re-implement mapReducer and filterReducer.

The solution: Abstraction.

Rather than hard-code push, we’ll let the user pass the function they want to use to combine elements as an argument.

const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
    const transformedElement = mapFunction(nextElement);

    transformedList = combiner(transformedList, transformedElement);

    return transformedList;
}

const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
    if (predicate(nextElement)) {
        filteredElements = combiner(filteredElements, nextElement);
    }

    return filteredElements;
}

We use it like this:

// push element to list, and return updated list
const pushCombiner = (list, element) => {
    list.push(element);
    return list;
}

const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
    const transformedElement = mapFunction(nextElement);

    transformedList = combiner(transformedList, transformedElement);

    return transformedList;
}

const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
    if (predicate(nextElement)) {
        filteredElements = combiner(filteredElements, nextElement);
    }

    return filteredElements;
}

users
  .reduce(
      filterReducer(removeNamesStartingWithA)(pushCombiner), [])
  .reduce(
      mapReducer(createUserInfoString)(pushCombiner), [])

Step 3: Transduction

At this point, everything’s in place for our final trick: Composing these transformations to fuse those chained calls to reduce.

Let’s see it in action first, and then review.

const R = require('ramda');

// final mapReducer/filterReducer functions
const mapReducer = mapFunction => combiner => (transformedList, nextElement) => {
    const transformedElement = mapFunction(nextElement);

    transformedList = combiner(transformedList, transformedElement);

    return transformedList;
}

const filterReducer = predicate => combiner => (filteredElements, nextElement) => {
    if (predicate(nextElement)) {
        filteredElements = combiner(filteredElements, nextElement);
    }

    return filteredElements;
}

// push element to list, and return updated list
const pushCombiner = (list, element) => {
    list.push(element);
    return list;
}

// filter's predicate function
const removeNamesStartingWithA = user => {
    return user.name[0].toLowerCase() != 'a'
}

// map's transformation function
const createUserInfoString = user => {
    const { name, email } = user
    return `${name}'s email address is: ${email}.`
}

// use composition to create a chain of functions for fusion (!)
const reductionChain = R.compose(
    filterReducer(removeNamesStartingWithA),
    mapReducer(createUserInfoString)
)

users
  .reduce(reductionChain(pushCombiner), [])

We can go one further by implementing a helper function.

const transduce = (input, initialAccumulator, combiner, reducers) => {
    const reductionChain = R.compose(...reducers);
    return input.reduce(reductionChain(combiner), initialAccumulator)
}

const result = transduce(users, [], pushCombiner, [
    filterReducer(removeNamesStartingWithA),
    mapReducer(createUserInfoString)
]);

…And that, as they say, is that.

Takeaways & Next Steps

This was a bit of a whirlwind, so it pays to step back from the details and consider a summary and next steps.

So, in summary:

  • Chaining map and filter keeps callbacks small, testable, and readable, but every call in the chain allocates a throwaway intermediate array.
  • Fusion composes those “single-purpose” callbacks into one, so a single call to map (or a single AND-ed predicate for filter) does the same work without the intermediate arrays.
  • Fusion breaks down for chains that mix map and filter, because the two methods interpret their callbacks’ return values differently.
  • Transduction gets around this: express map and filter as reducers parameterized by a combiner, compose those reducers, and run the whole pipeline in a single call to reduce.

As for next steps and more advanced concepts: look into the transducer implementations in libraries like Ramda (R.transduce and R.into) and transducers-js, which formalize the pattern we built by hand here and add conveniences like early termination.

…But, at the end of the day, I think the most important takeaway is this: There are more solutions to almost any problem than anyone could ever enumerate; the more of them you meet, the clearer you’ll think about your own, and the more fun you’ll have doing so.

I hope meeting Fusion and Transduction piques your interest, helps you think more clearly, and, ambitious as it is, proves at least a little bit fun.

As always, questions, comments, and feedback are encouraged—drop a line.

And don’t forget to have some fun.
