1626791931
#githubcopilot #github #vscode #visual-studio #vscodeextension #openai
1626140129
In this video, I’m going to show you how to enroll in the preview program for GitHub Copilot, and I’ll walk you through some of the examples that came to mind while trying this new feature.
Here are some useful links:
GitHub Copilot Official Website:
https://copilot.github.com/
GitHub Copilot Extension for VS Code:
https://marketplace.visualstudio.com/items?itemName=GitHub.copilot
GitHub Copilot on GitHub:
https://github.com/github/copilot-preview/
What do you think about this new feature? Let us know in the comment section and don’t forget to like and subscribe!
#github #copilot #ai #artificial-intelligence #developer
1627569822
In this video, we will look at and try GitHub Copilot, the AI pair programmer. A rough sketch of the kind of completion it produces follows the timestamps below.
Timestamps:
0:00 - Intro
2:28 - Specific Functions
4:52 - Generating Data
6:04 - Working with Data (Ordering, etc)
7:45 - Working With Fetch and API
12:04 - Async/Await Example
13:05 - Express Example
15:33 - React Example
16:57 - Final Thoughts
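To give a flavor of the fetch/async-await segments, here is a minimal sketch of the comment-driven style Copilot completes from: a descriptive comment plus a function signature that it expands into working code. This is not the video’s exact output; the User type, endpoint path, and function name are assumptions made for the example.

```typescript
// Hypothetical prompt: a descriptive comment plus a signature, similar to
// the fetch/async-await examples in the video. The API shape is made up.

interface User {
  id: number;
  name: string;
}

// Fetch all users from the API and return them sorted by name
async function getUsersSortedByName(baseUrl: string): Promise<User[]> {
  const response = await fetch(`${baseUrl}/users`);
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  const users: User[] = await response.json();
  return users.sort((a, b) => a.name.localeCompare(b.name));
}
```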
#github #copilot #artificialintelligence #ai
1599633600
Are you an Arctic Code Vault Contributor, or have you seen someone posting about it and don’t know what it is? Let’s take a look at what an Arctic Code Vault Contributor is and who gets this badge.
GitHub, the world’s largest open-source platform for software, has safely locked data of huge value and magnitude in a coal mine near the Norwegian town of Longyearbyen, in the Arctic region.
The GitHub Arctic Code Vault was first announced back in November 2019.
The GitHub Arctic Code Vault is a data repository preserved in the Arctic World Archive (AWA), a very-long-term archival facility 250 meters deep in the permafrost of an Arctic mountain. The archive is located in a decommissioned coal mine in the Svalbard archipelago, closer to the North Pole than the Arctic Circle.
Last year, GitHub said that it planned to capture a snapshot of every active public repository on 02/02/2020 and preserve that data in the Arctic Code Vault.
The project began on February 2, when the firm took a snapshot of all of GitHub’s active public repositories to store them in the vault. They initially intended to travel to Norway and personally escort the world’s open-source technology to the Arctic, but their plans were derailed by the global pandemic. They then had to wait until July 8 for the Arctic Code Vault data to be deposited.
GitHub announced that the code was successfully deposited in the Arctic Code Vault on July 8, 2020. Over the past several months, GitHub worked with its archive partner Piql to write the 21TB of GitHub repository data to 186 reels of piqlFilm (digital photosensitive archival film).
GitHub’s strategic software director, Julia Metcalf, wrote a blog post on the company’s website announcing the completion of GitHub’s Archive Program on July 8th. Discussing the objective of the Archive Program, Metcalf wrote, “Our mission is to preserve open-source software for future generations by storing your code in an archive built to last a thousand years.”
The Arctic Code Vault is only a small part of the wider GitHub Archive Program, however, which sees the company partner with the Long Now Foundation, the Internet Archive, the Software Heritage Foundation, Microsoft Research, and others.
Svalbard has been regulated by the international Svalbard Treaty as a demilitarized zone. Home to the world’s northernmost town, it is one of the most remote and geopolitically stable human habitations on Earth.
The AWA is a joint initiative between Norwegian state-owned mining company Store Norske Spitsbergen Kulkompani (SNSK) and very-long-term digital preservation provider Piql AS. AWA is devoted to archival storage in perpetuity. The film reels will be stored in a steel-walled container inside a sealed chamber within a decommissioned coal mine on the remote archipelago of Svalbard. The AWA already preserves historical and cultural data from Italy, Brazil, Norway, the Vatican, and many others.
The 02/02/2020 snapshot archived in the GitHub Arctic Code Vault will sweep up every active public GitHub repository, in addition to significant dormant repos.
The snapshot will include every repo with any commits between the announcement at GitHub Universe on November 13th and 02/02/2020, every repo with at least 1 star and any commits from the year before the snapshot (02/03/2019 – 02/02/2020), and every repo with at least 250 stars.
The snapshot will consist of the HEAD of the default branch of each repository, minus any binaries larger than 100KB in size—depending on available space, repos with more stars may retain binaries. Each repository will be packaged as a single TAR file. For greater data density and integrity, most of the data will be stored QR-encoded and compressed. A human-readable index and guide will itemize the location of each repository and explain how to recover the data.
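Purely as an illustration of the inclusion rules stated above, here is a minimal sketch of that selection logic written as a predicate. The function name, the Repo shape, and the use of a single last-commit date are assumptions made for the example; this is not GitHub’s actual implementation.

```typescript
// Illustrative only: encodes the three stated inclusion rules for the
// 02/02/2020 snapshot. The Repo shape and field names are hypothetical.
interface Repo {
  stars: number;
  lastCommitDate: Date; // most recent commit on the default branch
}

const ANNOUNCEMENT = new Date("2019-11-13").getTime(); // GitHub Universe announcement
const YEAR_BEFORE = new Date("2019-02-03").getTime();
const SNAPSHOT = new Date("2020-02-02").getTime();

function isIncludedInSnapshot(repo: Repo): boolean {
  const commit = repo.lastCommitDate.getTime();
  const activeSinceAnnouncement = commit >= ANNOUNCEMENT && commit <= SNAPSHOT;
  const starredAndRecentlyActive =
    repo.stars >= 1 && commit >= YEAR_BEFORE && commit <= SNAPSHOT;
  const popular = repo.stars >= 250;
  return activeSinceAnnouncement || starredAndRecentlyActive || popular;
}

// Example: a one-star repo last touched in mid-2019 is included.
console.log(isIncludedInSnapshot({ stars: 1, lastCommitDate: new Date("2019-06-01") })); // true
```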
The company further shared that every reel of the archive includes a copy of the “Guide to the GitHub Code Vault” in five languages, written with input from GitHub’s community and available at the Archive Program’s own GitHub repository.
#github #open-source #coding #open-source-contribution #contributing-to-open-source #github-arctic-code-vault #arctic-code-vault #arctic-code-vault-contributor
1604048400
The story of Softagram is a long one with many twists. Everything started a long time ago in a small company working on static analysis tools. After many phases, Softagram now focuses on helping developers get visual feedback on a code change: how the software design is evolving in the pull request under review.
While it is trivial to write 20 KLOC apps without the help of tooling, things usually start getting complicated when the system grows beyond 100 KLOC.
The risk of the god-class anti-pattern and of mixed-up responsibilities increases exponentially as the software grows larger.
To help with that, software evolution can be tracked safely with explicit dependency change reports provided automatically for each pull request. Blocking a bad PR becomes easy, and the visual reports also have a democratizing effect on code review.
Architectural analysis of the code identifies how the delta impacts the code base. Language-specific analyzers extract the essential internal and external dependency structures from each of the mainstream programming languages.
Checking for rule violations or anomalies in the delta, e.g. finding cyclical dependencies. Graph theory is a big help in finding unwanted or weird dependencies (a minimal sketch of cycle detection follows below).
Building visualization for humans. Complex structures such as software are not easy to represent without the help of graph visualization. Here the change-graph visualization technology developed within the last few years plays a vital role.
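The text above does not describe Softagram’s internal algorithms, so the following is only a rough sketch of the graph-theory idea it mentions: detecting a cyclical dependency in a module dependency graph with a depth-first search. All module names and the graph shape are hypothetical.

```typescript
// Minimal illustration of cycle detection in a dependency graph via DFS.
// The graph maps each module to the modules it depends on.
type DependencyGraph = Map<string, string[]>;

function findCycle(graph: DependencyGraph): string[] | null {
  const visiting = new Set<string>(); // nodes on the current DFS path
  const visited = new Set<string>();  // nodes fully explored
  const path: string[] = [];

  function dfs(node: string): string[] | null {
    if (visiting.has(node)) {
      // Found a back edge: return the cycle portion of the current path.
      return [...path.slice(path.indexOf(node)), node];
    }
    if (visited.has(node)) return null;

    visiting.add(node);
    path.push(node);
    for (const dep of graph.get(node) ?? []) {
      const cycle = dfs(dep);
      if (cycle) return cycle;
    }
    path.pop();
    visiting.delete(node);
    visited.add(node);
    return null;
  }

  for (const node of graph.keys()) {
    const cycle = dfs(node);
    if (cycle) return cycle;
  }
  return null;
}

// Example: ui -> api -> db -> ui forms a cycle a review bot could flag.
const graph: DependencyGraph = new Map([
  ["ui", ["api"]],
  ["api", ["db"]],
  ["db", ["ui"]],
]);
console.log(findCycle(graph)); // ["ui", "api", "db", "ui"]
```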
#automated-code-review #code-review-automation #code-reviews #devsecops #software-development #code-review #coding #good-company