We live in a world that moves fast. Compared to the mid-1990s through the early 2000s, we have incredibly intelligent technology. Effectively, we have supercomputers in our pockets, and our actual, modern supercomputers would have seemed like works of fiction just two decades ago. Not only is our computing fast, so are our connections: at the time of writing, cellular 4G averages 18.1 Mbps and 5G averages 111.8 Mbps.

With all this speed, abstraction after abstraction has been placed on top of our data and connections to make development easier, but that ease has a cost: we send a lot of data over the wire. Counting all of a page's assets, websites targeting desktops send almost 2.1 MB of data, while mobile pages send nearly 1.9 MB. These speeds were not always this ubiquitous. Perhaps it's time to look back at how applications provided relatively fast interaction with data in a world long forgotten: the world of 56 Kbps.

My goal is that, by the end of this article, you have a stronger understanding of some of the low-level APIs that make moving large amounts of data possible. Additionally, we'll use this knowledge to build a secure streaming service on top of Amazon Web Services (AWS) S3 that restricts data to certain roles. We'll do this using the smallest instance available on AWS, demonstrating the power of streams to move big data with small machines.
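The core idea behind that "big data, small machine" claim can be sketched before we touch S3 at all: a stream is consumed in fixed-size chunks, so memory use is bounded by the buffer size rather than the payload size. The snippet below is a minimal illustration of that pattern, not code from the article's final service; the `CopyInChunks` helper and the `MemoryStream` stand-in for a large remote body (such as an S3 object stream) are assumptions for the sake of the example.

```csharp
using System;
using System.IO;

class StreamCopyDemo
{
    // Copy from source to destination in small chunks. Memory use stays
    // at roughly the buffer size no matter how large the source is,
    // which is what lets a tiny instance serve very large objects.
    static long CopyInChunks(Stream source, Stream destination, int bufferSize)
    {
        var buffer = new byte[bufferSize];
        long totalBytes = 0;
        int bytesRead;
        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, bytesRead);
            totalBytes += bytesRead;
        }
        return totalBytes;
    }

    static void Main()
    {
        // Stand-in for a large remote stream (e.g. an S3 response body).
        var payload = new byte[1_000_000];
        new Random(42).NextBytes(payload);

        using var source = new MemoryStream(payload);
        using var destination = new MemoryStream();

        // Only 4 KB is held in memory at any moment during the copy.
        long copied = CopyInChunks(source, destination, bufferSize: 4096);
        Console.WriteLine($"Copied {copied} bytes in 4 KB chunks.");
    }
}
```

The built-in `Stream.CopyTo` does essentially the same thing; writing the loop out makes the constant-memory property visible.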

To follow along with this guide you’ll need these prerequisites:

  • Basic knowledge of .NET
  • Visual Studio Code or Visual Studio 2019
  • An Okta Developer Account (free forever, to handle your OAuth needs)
  • An AWS account (we’ll be using a free tier product)
  • AWS Toolkit for Visual Studio


How to Master the FileStream in C#