Cody Osinski

June 23, 2021

Azure high-performance computing at Supercomputing 2020

Customers around the world rely on Microsoft Azure to drive innovations related to our environment, public health, energy sustainability, weather modeling, economic growth, and more. Finding solutions to these important challenges requires huge amounts of focused computing power. Customers are increasingly finding that the best way to access such high-performance computing (HPC) is through the agility, scale, security, and leading-edge performance of Azure’s purpose-built HPC and AI cloud services.

Azure’s market-leading vision for HPC and AI is based on a core of genuine and recognized HPC expertise, using proven HPC technology and design principles, enhanced with the best features of the cloud. The result is a capability that delivers performance, scale, and value unlike any other cloud. This means applications scaling 12 times higher than other public clouds. It means higher application performance per node. It means powering AI workloads for one customer with a supercomputer fit to be among the top five in the world. And it means delivering massive compute power into the hands of medical researchers over a weekend to prove out life-saving innovations in the fight against COVID-19.

Big moments for Azure HPC and AI Supercomputing in 2020

OpenAI

A highlight of the year was Microsoft’s partnership with OpenAI. Earlier this year, OpenAI released its first application programming interface (API) for its general-purpose language model, Generative Pre-trained Transformer 3 (GPT-3), giving developers access to advanced natural language processing (NLP) technologies for building more interactive applications and services. Shortly thereafter, OpenAI announced that it had agreed to license its GPT-3 technology to Microsoft for use within Microsoft’s own products and services.

This partnership will continue to give developers and scientists new ways not just to transform their organizations, but to do so faster and with less disruption. It was also announced that Microsoft has deployed dedicated cloud supercomputing capability for OpenAI, comparable to the top five supercomputers in the world.

#announcements #azure #supercomputing 2020

Brain Crist

July 14, 2020

Citrix Bugs Allow Unauthenticated Code Injection, Data Theft

Multiple vulnerabilities in the Citrix Application Delivery Controller (ADC) and Gateway would allow code injection, information disclosure and denial of service, the networking vendor announced Tuesday. Four of the bugs are exploitable by an unauthenticated, remote attacker.

The Citrix products (formerly known as NetScaler ADC and Gateway) are used for application-aware traffic management and secure remote access, respectively, and are installed in at least 80,000 companies in 158 countries, according to a December assessment from Positive Technologies.

Other flaws announced Tuesday also affect Citrix SD-WAN WANOP appliances, models 4000-WO, 4100-WO, 5000-WO and 5100-WO.

Attacks on the management interface of the products could result in system compromise by an unauthenticated user on the management network; or system compromise through cross-site scripting (XSS). Attackers could also create a download link for the device which, if downloaded and then executed by an unauthenticated user on the management network, could result in the compromise of a local computer.

“Customers who have configured their systems in accordance with Citrix recommendations [i.e., to have this interface separated from the network and protected by a firewall] have significantly reduced their risk from attacks to the management interface,” according to the vendor.

Threat actors could also mount attacks on Virtual IPs (VIPs). VIPs, among other things, are used to provide users with a unique IP address for communicating with network resources for applications that do not allow multiple connections or users from the same IP address.

The VIP attacks include denial of service against either the Gateway or Authentication virtual servers by an unauthenticated user; or remote port scanning of the internal network by an authenticated Citrix Gateway user.

“Attackers can only discern whether a TLS connection is possible with the port and cannot communicate further with the end devices,” according to the critical Citrix advisory. “Customers who have not enabled either the Gateway or Authentication virtual servers are not at risk from attacks that are applicable to those servers. Other virtual servers e.g. load balancing and content switching virtual servers are not affected by these issues.”

A final vulnerability has been found in Citrix Gateway Plug-in for Linux that would allow a local logged-on user of a Linux system with that plug-in installed to elevate their privileges to an administrator account on that computer, the company said.

#vulnerabilities #adc #citrix #code injection #critical advisory #cve-2020-8187 #cve-2020-8190 #cve-2020-8191 #cve-2020-8193 #cve-2020-8194 #cve-2020-8195 #cve-2020-8196 #cve-2020-8197 #cve-2020-8198 #cve-2020-8199 #denial of service #gateway #information disclosure #patches #security advisory #security bugs

Eric Bukenya

June 26, 2021

Learn NoSQL in Azure: Diving Deeper into Azure Cosmos DB

This article is part of the series Learn NoSQL in Azure, in which we explore Azure Cosmos DB, a non-relational database system used widely for a variety of applications. Azure Cosmos DB is one of Microsoft’s serverless databases on Azure; it is highly scalable and can be distributed across all Azure regions. It is offered as a platform as a service (PaaS), and you can use it to build databases with very high throughput and very low latency. Using Azure Cosmos DB, customers can replicate their data across multiple regions around the globe, as well as across multiple locations within the same region. This makes Cosmos DB a highly available database service, with 99.999% availability for reads and writes in multi-region mode and 99.99% availability in single-region mode.
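To put those availability figures in perspective, a quick back-of-the-envelope calculation shows how much downtime per year each SLA permits (a simple arithmetic sketch, not an official SLA calculator):

```python
# Back-of-the-envelope: maximum downtime per year allowed by an SLA.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability: float) -> float:
    """Maximum minutes of downtime per year permitted by an SLA."""
    return MINUTES_PER_YEAR * (1 - availability)

print(f"99.99%  -> {downtime_minutes_per_year(0.9999):.2f} min/year")   # single-region
print(f"99.999% -> {downtime_minutes_per_year(0.99999):.2f} min/year")  # multi-region
```

In other words, moving from single-region to multi-region mode shrinks the permitted downtime from roughly 53 minutes a year to just over 5.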

In this article, we will focus more on how Azure Cosmos DB works behind the scenes and how can you get started with it using the Azure Portal. We will also explore how Cosmos DB is priced and understand the pricing model in detail.

How Azure Cosmos DB works

As already mentioned, Azure Cosmos DB is a multi-model NoSQL database service that is geographically distributed across multiple Azure regions. This allows customers to deploy their databases in multiple locations around the globe, which helps reduce read latency for the users of an application.

As you can see in the figure above, Azure Cosmos DB is distributed across the globe. Suppose you have a web application that is hosted in India. In that case, the NoSQL database in India will be considered the master database for writes, and all the other databases can be considered read replicas. Whenever new data is generated, it is written to the database in India first and then synchronized with the other databases.
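The master/replica flow described above can be sketched with a toy model. The classes and regions below are illustrative only; this is not Cosmos DB’s actual replication protocol:

```python
# Toy model of a single write region with asynchronously synced read replicas.

class Replica:
    def __init__(self, region: str):
        self.region = region
        self.data: dict = {}

class WriteRegion(Replica):
    def __init__(self, region: str, replicas: list):
        super().__init__(region)
        self.replicas = replicas

    def write(self, key, value):
        self.data[key] = value      # committed in the write region first

    def sync(self):
        for r in self.replicas:     # later, changes propagate to replicas
            r.data.update(self.data)

us = Replica("US")
primary = WriteRegion("India", [us])

primary.write("order:1", "shipped")
print(us.data.get("order:1"))  # stale before sync: None
primary.sync()
print(us.data.get("order:1"))  # after sync: "shipped"
```

The window between `write` and `sync` is exactly where the consistency levels discussed next come into play.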

Consistency Levels

While maintaining data across multiple regions, the most common challenge is the latency with which data becomes available in the other regions. For example, when data is written to the database in India, users in India will see that data sooner than users in the US, because of the synchronization latency between the two regions. To manage this, customers can choose from several modes that define how soon they want their data to be made available in the other regions. Azure Cosmos DB offers five levels of consistency, which are as follows:

  • Strong
  • Bounded staleness
  • Session
  • Consistent prefix
  • Eventual

In most common NoSQL databases, there are only two levels: Strong and Eventual. Strong is the most consistent level, while Eventual is the least. As we move from Strong to Eventual, consistency decreases but availability and throughput increase. This is a trade-off that customers need to decide on based on the criticality of their applications. If you want to read about the consistency levels in more detail, the official guide from Microsoft is the easiest to understand. You can refer to it here.
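The two extremes of that spectrum can be illustrated conceptually. In the sketch below, a Strong read always reflects the latest committed write, while an Eventual read may return whatever a not-yet-synced replica holds (this is a conceptual toy, not how Cosmos DB actually routes reads):

```python
# Conceptual sketch of Strong vs. Eventual reads.
committed = {"counter": 2}  # latest committed state in the write region
replica = {"counter": 1}    # a replica that has not caught up yet

def read(key: str, level: str):
    if level == "Strong":
        return committed[key]   # always see the latest committed write
    return replica.get(key)     # Eventual: serve local, possibly stale data

print(read("counter", "Strong"))    # 2
print(read("counter", "Eventual"))  # 1
```

Bounded staleness, Session, and Consistent prefix sit between these extremes, bounding how stale an Eventual-style read is allowed to be.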

Azure Cosmos DB Pricing Model

Now that we have some idea of how the NoSQL database Azure Cosmos DB works on Azure, let us try to understand how the database is priced. To work with any cloud-based service, it is essential to have a sound knowledge of how the service is charged; otherwise, you might end up paying much more than you expected.

If you browse to the pricing page of Azure Cosmos DB, you can see that there are two modes in which the database services are billed.

  • Database Operations – Whenever you execute or run queries against your NoSQL database, resources are consumed. Azure measures this usage in Request Units (RUs). The number of RUs consumed per second is aggregated and billed.
  • Consumed Storage – As you start storing data in your database, it takes up space. This storage is billed at the standard SSD-based storage rate across all Azure regions globally.

Let’s learn about this in more detail.
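The two billing meters can be combined into a rough monthly estimate. The rates below are illustrative assumptions only; always check the current Azure Cosmos DB pricing page for your region:

```python
# Rough monthly cost estimate: provisioned throughput + consumed storage.
# The rates are assumed example values, NOT official Azure prices.
RATE_PER_100_RU_S_PER_HOUR = 0.008  # assumed USD per 100 RU/s per hour
RATE_PER_GB_PER_MONTH = 0.25        # assumed USD per GB per month
HOURS_PER_MONTH = 730

def estimate_monthly_cost(provisioned_ru_s: int, storage_gb: float) -> float:
    """Estimated monthly cost in USD under the assumed rates above."""
    throughput = (provisioned_ru_s / 100) * RATE_PER_100_RU_S_PER_HOUR * HOURS_PER_MONTH
    storage = storage_gb * RATE_PER_GB_PER_MONTH
    return round(throughput + storage, 2)

# e.g. 400 RU/s of provisioned throughput and 10 GB of data:
print(estimate_monthly_cost(400, 10))
```

Note that under this model the throughput charge accrues on what you provision, not on what you actually consume, which is why right-sizing RUs matters.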

#azure #azure cosmos db #nosql #azure #nosql in azure #azure cosmos db

Ruthie Bugala

May 8, 2021

How to set up Azure Data Sync between Azure SQL databases and on-premises SQL Server

In this article, you will learn how to set up the Azure Data Sync service, and how to create and configure a data sync group between an Azure SQL database and an on-premises SQL Server.

In this article, you will see:

  • Overview of the Azure SQL Data Sync feature
  • Discussion of key components
  • Comparison of Azure SQL Data Sync with other Azure data options
  • Setting up Azure SQL Data Sync
  • More…

Azure Data Sync

Azure Data Sync is a synchronization service set up on an Azure SQL Database. The service synchronizes data across multiple SQL databases. You can set up bi-directional data synchronization, in which data flows in both directions between the SQL databases; this can be between an Azure SQL database and an on-premises SQL Server, or between Azure SQL databases within the cloud. At the moment, the only limitation is that it does not support Azure SQL Managed Instance.
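Azure SQL Data Sync uses a hub-and-spoke topology: each member database syncs with a central hub database rather than directly with the other members. The toy sketch below illustrates that shape with plain dictionaries; the naive "hub wins" conflict handling is illustrative only, not the service’s actual algorithm:

```python
# Toy hub-and-spoke sync in the spirit of Azure SQL Data Sync.
# Members push local rows to the hub, then pull the merged state back.

def sync_member(hub: dict, member: dict) -> None:
    for key, value in member.items():
        hub.setdefault(key, value)  # push rows the hub hasn't seen ("hub wins" on conflict)
    member.update(hub)              # pull the merged state back down

hub: dict = {}
azure_db = {"customer:1": "alice"}      # member: Azure SQL database
on_prem_db = {"customer:2": "bob"}      # member: on-premises SQL Server

for db in (azure_db, on_prem_db):
    sync_member(hub, db)
sync_member(hub, azure_db)  # second pass so the first member sees later rows

print(azure_db)    # both customer rows present
print(on_prem_db)  # both customer rows present
```

The real service tracks row-level changes and offers configurable conflict resolution (hub wins or member wins), but the hub-centric flow is the key idea.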

#azure #sql azure #azure sql #azure data sync #azure sql #sql server

Ron Cartwright

September 20, 2020

Getting Started With Azure Event Grid Viewer

In the last article, we looked at how to get started with Azure DevOps: Getting Started With Audit Streaming With Event Grid.

In this article, we will go to the next step: creating a subscription and using webhook event handlers to view those logs in our Azure web application.
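Before Event Grid delivers any real events to a webhook, it first POSTs a SubscriptionValidationEvent, and the endpoint must echo the validation code back. The Event Grid Viewer app handles this for you; the sketch below shows the handshake logic in isolation as a plain function (the request body shown is a simulated example, not a captured payload):

```python
import json

# Minimal handler for the Event Grid subscription validation handshake.
VALIDATION_EVENT = "Microsoft.EventGrid.SubscriptionValidationEvent"

def handle_event_grid_request(body: str) -> dict:
    """Return the JSON response for an incoming Event Grid POST body."""
    events = json.loads(body)
    for event in events:
        if event.get("eventType") == VALIDATION_EVENT:
            # Echo the code back so Event Grid activates the subscription.
            return {"validationResponse": event["data"]["validationCode"]}
    # Real application events would be processed (e.g., displayed) here.
    return {"received": len(events)}

# Simulated handshake request from Event Grid:
sample = json.dumps([{
    "eventType": VALIDATION_EVENT,
    "data": {"validationCode": "1234-abcd"},
}])
print(handle_event_grid_request(sample))  # {'validationResponse': '1234-abcd'}
```

Only after this handshake succeeds will the subscription start pushing audit log events to the viewer.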

#cloud #tutorial #azure #event driven architecture #realtime #signalr #webhook #azure web services #azure event grid #azure #azure event grid #serverless architecture #application integration