Summary

Mark Rendle talks about the various technologies and standards from across the years, the pros and cons of each, and which solutions are appropriate for which problems. He looks at ways of migrating legacy APIs from old technologies to the modern alternatives.

Bio

Mark Rendle is developing a new commercial product for .NET teams, Visual ReCode, which rescues code from obsolete frameworks like WCF and migrates it to current ones like gRPC. He is the co-author of the Microsoft e-book ASP.NET Core gRPC for WCF Developers, and a popular speaker at conferences around the world.

About the conference

Software is changing the world. QCon empowers software development by facilitating the spread of knowledge and innovation in the developer community. A practitioner-driven conference, QCon is designed for technical team leads, architects, engineering directors, and project managers who influence innovation in their teams.

Transcript

Rendle: I’m Mark Rendle. I’m here at the start of the API track to talk a little bit about the history and how we got to where we are today, and where we might be going next, and show you a little bit of code along the way. Generally, get everyone set up for the day. Who’s planning on spending most of the day in this track? There should be some interesting stuff. I’m expecting lots of different people to come along and say this is the right way to build an API. Then at the end of the day, you’ll have eight right ways to build an API, and you can just go away and choose which one you liked the best.

A brief history: I will try and explain how we got to where we are, starting from the 1970s, which is when programming really got started. People were doing stuff with COBOL and Fortran, and we were starting to get into high-level languages like C and everything else.

Application Programming Interfaces (APIs)

We had to create application programming interfaces. If we made a piece of software, whether that was order management, stock control, or, usually back in those days, finance (banks were early adopters), the different applications dealing with different things needed to be able to talk to each other, so we had to create APIs. We've been using APIs as programmers ever since. Pretty much everything we use could be considered an API. Even the programming languages that we write code in, whether that's C, or C#, or Java, or Python: there's a piece of software, which might be a compiler or a dynamic language runtime, and we are using the API of that compiler or runtime to create our software. Then everything we interact with, whether that's a database, or a messaging system, or a queue, has an API that we talk to as well. And of course, we now have the higher-level APIs of talking to other services in service-oriented architecture or microservices.

Back in the 1970s, things were largely running on mainframes. Very few organizations in those days would have had more than one mainframe. That wasn't the point of a mainframe: you bought a mainframe, and if you needed more processing power, you bought a bigger mainframe, or you made your mainframe bigger, because you could just buy another couple of fridges, tack them on to the end, and add another 2 MB of disk space.

Just a little bit about me. This is actually my 31st year as a professional software developer. I’ve been doing this since I was 16. I started on UNIX systems with Wyse terminals. My first day at work, they said, “Learn C.” They gave me the Kernighan and Ritchie book with Hello World, and a Wyse terminal. I sat down. I said, “How do you edit files?” They said, “Use vi.” Then I spent six months learning vi.

The mainframes that existed had time-sharing systems with multiple processes running, and those processes needed a way to talk to each other. One of the first examples of an inter-process API was Message Oriented Middleware, or MOM, which was pioneered by IBM in the 1970s. Anything that was pioneered in the 1970s, you can pretty much assume it was IBM pioneering it. Message Oriented Middleware was what we would now think of as RabbitMQ, really, but with a little bit of intelligence: something could send a message, the middleware could transform it and then send it on somewhere else, and it might get a response back. It was very asynchronous. It's the thing that we do today, thinking we're being very clever, using Kafka, or RabbitMQ, or AMQP. We're basically doing the same thing that people on mainframes were doing with COBOL and MOM in the 1970s.
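The core MOM idea described above, a broker that sits between sender and receiver, transforming and routing messages asynchronously, can be sketched in a few lines. This is a minimal illustration, not any real MOM product's API; the `Broker` class and its method names are invented for the example.

```python
import queue

class Broker:
    """Toy message-oriented middleware: route + transform + queue."""

    def __init__(self):
        self.queues = {}   # destination name -> queue of delivered messages
        self.routes = {}   # message type -> (transform fn, destination)

    def register(self, msg_type, transform, destination):
        # The middleware, not the sender, knows where messages go
        # and what shape the receiver expects.
        self.routes[msg_type] = (transform, destination)
        self.queues.setdefault(destination, queue.Queue())

    def send(self, msg_type, payload):
        # The sender fires and forgets; delivery is asynchronous.
        transform, destination = self.routes[msg_type]
        self.queues[destination].put(transform(payload))

    def receive(self, destination):
        return self.queues[destination].get_nowait()

broker = Broker()
# Transform an internal order record into the shape a billing system expects.
broker.register("order.placed",
                lambda o: {"account": o["customer"], "amount": o["total"]},
                "billing")

broker.send("order.placed", {"customer": "ACME", "total": 99.95})
print(broker.receive("billing"))  # {'account': 'ACME', 'amount': 99.95}
```

The sender never names the receiver, which is the decoupling that made MOM (and makes Kafka or RabbitMQ today) useful for wiring independent applications together.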

The other API that was created then was ISAM, which is how databases used to work. The first database I worked on was Informix SQL. Any Informix people in the room? Other people were doing Oracle. Somehow I ended up at a place that was doing Informix. Informix was obviously superior. An Informix SQL database was just a bunch of C-ISAM files. We used a library called C-ISAM to talk to the ISAM files. Then Informix invented ESQL/C, where you would put dollar symbols into your C code and write SQL statements in it. Then it would go through a pre-compiler that just turned those into the C-ISAM API calls. Then it would compile and link. We had a table tennis table, because compiling and linking in those days used to take three hours. If you broke the build, you had to stay and fix it for the next morning. If nobody broke the build, then whoever lost the table tennis tournament had to stay and make sure everything was ready for the next morning.
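The shape of the ISAM API mentioned above, indexed sequential access, is worth seeing: records live in key order, and an index lets you seek directly to a key and then read onward sequentially. This is a rough sketch only; the class and method names here are invented for illustration, not the actual C-ISAM calls.

```python
import bisect

class IsamFile:
    """Toy ISAM-style file: keyed lookup plus sequential reads in key order."""

    def __init__(self):
        self.keys = []      # sorted index over the records
        self.records = {}   # key -> record

    def write(self, key, record):
        if key not in self.records:
            bisect.insort(self.keys, key)  # keep the index sorted
        self.records[key] = record

    def read_equal(self, key):
        # Direct keyed lookup through the index.
        return self.records[key]

    def read_from(self, key):
        # Seek to the first key >= the given one, then scan sequentially.
        start = bisect.bisect_left(self.keys, key)
        for k in self.keys[start:]:
            yield self.records[k]

f = IsamFile()
f.write(1002, {"name": "widget"})
f.write(1001, {"name": "gadget"})
f.write(1003, {"name": "sprocket"})
print(f.read_equal(1001)["name"])              # gadget
print([r["name"] for r in f.read_from(1002)])  # ['widget', 'sprocket']
```

An embedded-SQL pre-compiler like ESQL/C essentially rewrote each SQL statement in the C source into sequences of low-level calls of this kind before handing the result to the C compiler.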


A Brief History of the Future of the API