S3git-rb: Ruby Gem for S3git: Git For Cloud Storage


This is the Ruby interface for s3git. Please see the main s3git repository (https://github.com/s3git/s3git) for a general introduction to s3git, including its use cases.

This Ruby Gem is based on the s3git-go package, which is invoked via a foreign function interface (FFI).

DISCLAIMER: This software is still under development (although the storage format/model is stable) -- use at your own peril for now

Note that the API is not stable yet; you can expect minor changes and extensions.


Please make sure you have a working Golang environment installed, otherwise the s3git Gem will not compile (i.e. go build -buildmode=c-shared -o libs3git.so libs3git.go, as run in the ext subdirectory). See the official Golang installation instructions for setting up a working environment.

Also, the s3git-go package needs to be available locally (go get -d github.com/s3git/s3git-go).

Assuming you have Rails installed, proceed as follows:

$ rails new s3git-test
$ cd s3git-test
$ # Add s3git to Gemfile
$ echo 'gem "s3git", :git => "git://github.com/s3git/s3git-rb.git"' >> Gemfile
$ bundle update

That's it -- you are now ready to do your first experiment as listed below.
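If you prefer to edit the Gemfile by hand instead of using the echo command above, the equivalent entry is:

```ruby
# Gemfile -- pull s3git straight from GitHub, as in the upstream instructions
gem "s3git", :git => "git://github.com/s3git/s3git-rb.git"
# or, with newer Ruby hash syntax:
# gem "s3git", git: "git://github.com/s3git/s3git-rb.git"
```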

Create a repository

$ irb
> require 'bundler/setup'
> require 's3git'
> S3git.init_repository '.'
> S3git.add 'hello s3git'
> S3git.commit 'My first s3git commit from Ruby'
> exit
# show commit history
$ s3git log --pretty
66aac2f7e3fe675215cd3cf491d5adb31d270c29dc9b40aa50c6c8606cf5eb784b250a4f181caccea393a34b2ba522f2a0678685014bc27caf987fc13c3bef76 My first s3git commit from Ruby
$ s3git ls
$ s3git cat c518dc5f1d95258dc91f6d285e7ea7300f37dea4dd517173f2e23afe0cb52bc9d8eb18683cdcf377e96a2d5a81585e61f6d27fa5d017cad53836bd050e9f105f
hello s3git
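As the log and cat output above shows, s3git addresses content by 512-bit (BLAKE2b) digests, printed as 128 hexadecimal characters. A quick pure-Ruby sanity check for such keys (the helper name is our own, not part of the gem):

```ruby
# Hypothetical helper (not part of the s3git gem): check whether a string
# looks like an s3git content key, i.e. a 512-bit digest in hex form.
def s3git_key?(str)
  # 512 bits => 64 bytes => 128 hex characters
  str.is_a?(String) && str.match?(/\A\h{128}\z/)
end

puts s3git_key?("66aac2f7e3fe675215cd3cf491d5adb31d270c29dc9b40aa50c6c8606cf5eb784b250a4f181caccea393a34b2ba522f2a0678685014bc27caf987fc13c3bef76")
# => true
```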

Clone a repository

> require 'bundler/setup'
> require 's3git'
> require 'tmpdir'
> S3git.clone 's3://s3git-spoon-knife', Dir.mktmpdir, {access_key: 'AKIAJYNT4FCBFWDQPERQ', secret_key: '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'}
> S3git.list('').each { |hash| puts hash } 

Dump contents of a repository

> %w(bundler/setup s3git tmpdir open-uri).each { |gem| require gem } 
> S3git.init_repository Dir.mktmpdir
> S3git.add 'hello s3git'
> S3git.add 'Ruby rocks'
> S3git.add open('https://github.com/s3git/s3git/blob/master/README.md')
> S3git.add open('local-file.txt')
> S3git.list('').each { |hash| puts S3git.get(hash).read } 

List multiple commits

> %w(bundler/setup s3git tmpdir).each { |gem| require gem } 
> S3git.init_repository Dir.mktmpdir
> S3git.add 'first file'
> S3git.commit 'first commit'
> S3git.add 'second file'
> S3git.commit 'second commit'
> S3git.list_commits.each { |c| puts c["Message"] } 

Create a snapshot

> %w(bundler/setup s3git tmpdir).each { |gem| require gem }
> dir = Dir.mktmpdir
> S3git.init_repository dir
> File.open(dir + '/file.txt', 'w') { |file| file.write("my snapshot") }
> S3git.snapshot_create 'Initial snapshot'
> S3git.snapshot_list ''

Limitations and Optimizations

  • Streams are not yet natively supported (temp files are used under the hood)
  • Methods that return arrays (e.g. log or list) are currently limited to a maximum of 1000 responses
  • Proper error handling is largely missing
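The 1000-response cap can in principle be worked around by sharding list calls over the leading hex digits of the content keys. The sketch below assumes that S3git.list(prefix) filters by key prefix (as the examples above suggest); the prefix-enumeration helper itself is plain Ruby and hypothetical, not part of the gem:

```ruby
# Sketch of a workaround for the 1000-item cap: query once per hex prefix.
HEX_DIGITS = [*'0'..'9', *'a'..'f']

def enumerate_prefixes(depth = 1)
  # All hex strings of the given length, e.g. depth 2 -> "00".."ff" (256 prefixes)
  HEX_DIGITS.repeated_permutation(depth).map(&:join)
end

# Against a repository this would then be used roughly as (not run here):
#   enumerate_prefixes(2).flat_map { |p| S3git.list(p) }
puts enumerate_prefixes(1).size  # 16 prefixes at depth 1
```

Increasing the depth multiplies the number of queries by 16 but divides the expected result count per query accordingly, so pick the smallest depth at which no single prefix exceeds 1000 keys.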


Contributions are welcome, please submit a pull request for any enhancements.

Download Details:

Author: s3git
Source Code: https://github.com/s3git/s3git-rb 
License: Apache-2.0 license

#go #golang #ruby 
