A definitive guide to turn CSV files into Power BI visuals using Azure

Cloud, Big Data, and Business Intelligence are the three buzzwords of the decade. Everyone is talking about them. Everyone wants to do it. But no one tells you how to do it. How do you use the cloud to process your big data and build _intelligence_ around it to make _business_ decisions? There are multiple answers to that question, and in this guide we answer it using Microsoft’s cloud solution (Azure) and Microsoft’s BI tool (Power BI) to get you started in the right direction.

Microsoft Azure is one of the leading providers of cloud solutions, offering an end-to-end set of tools and techniques to ingest, analyze, and consume vast sources and formats of data.

Disclaimer and Terms of use

Please read our terms of use before proceeding with this article.

Caution

Microsoft Azure is a paid service, and following this article may result in charges to you or your organization.

Prerequisites

An active Microsoft Azure subscription

How to get an Azure subscription?

The very first step in the journey towards representing data in an easily digestible and usable form is recognizing the source and format of the data. Traditionally, data professionals focused on Extract, Transform & Load (ETL) to load and transform data. The advent of Azure has opened the doorway to processing unstructured data at an unlimited and unprecedented scale. This change has shifted the order of transformation and loading, giving us Extract, Load & Transform (ELT). The basic principles and steps remain the same; they just follow a different order. A data project in Azure typically involves the following steps:

  1. Ingest: Identify the tools, technologies, and methods to load the data
  2. Prep and train: Identify the tools, technologies, and methods to transform the data

These are followed by two additional steps to analyze and consume the cleansed data:

  • Model and serve: Identify the tools and methods to model and analyze the data
  • Consume: Identify the tools and techniques to consume or present the data

In this article, we take a holistic look at transforming and loading data in Azure, moving through the different phases of ELT all the way to consumption, with the aid of an example. To begin our journey, we will:

  • Take publicly available COVID-19 data from GitHub (source)
  • Store the CSV files to Azure Data Lake Storage Gen2 with the help of Azure Data Factory (ingest)
  • Transform and cleanse the CSV files into relational data in Azure Databricks (prep and train; a sketch of this step follows the list)
  • Store the cleansed data in Azure Synapse Analytics data warehouse (model and serve)
  • And finally, present the prepared data in the form of Power BI visuals (consume)
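To make the prep-and-train step concrete, below is a minimal PySpark sketch of the kind of cleansing job this pipeline might run in an Azure Databricks notebook. The storage account, containers, Synapse server, and table names are hypothetical placeholders, and we assume the cluster is already configured with credentials for ADLS Gen2 and Azure Synapse; treat it as an illustration of the pattern, not a drop-in script.

# Minimal Databricks (PySpark) sketch; `spark` is the ambient SparkSession in a notebook.
# All account, container, server, and table names below are hypothetical.

# Read the raw COVID-19 CSV files that Data Factory landed in ADLS Gen2
raw_path = "abfss://raw@covidlakedemo.dfs.core.windows.net/covid19/*.csv"
df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv(raw_path))

# Basic cleansing: normalize column names and drop rows missing a report date
df = df.toDF(*[c.strip().lower().replace(" ", "_").replace("/", "_") for c in df.columns])
df = df.dropna(subset=["date"])

# Write the cleansed, relational data to a Synapse staging table via the
# Databricks Synapse connector (it stages data through tempDir before loading)
(df.write
   .format("com.databricks.spark.sqldw")
   .option("url", "jdbc:sqlserver://covidsynapse.sql.azuresynapse.net:1433;database=coviddw")
   .option("forwardSparkAzureStorageCredentials", "true")
   .option("tempDir", "abfss://staging@covidlakedemo.dfs.core.windows.net/tmp")
   .option("dbTable", "stg.covid_daily")
   .mode("overwrite")
   .save())

From there, Power BI can connect to the Synapse table to build the visuals in the consume step.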

#data-science

Pdf2gerb: Perl Script Converts PDF Files to Gerber format


Pdf2Gerb generates Gerber 274X photoplotting and Excellon drill files from PDFs of a PCB. Up to three PDFs are used: the top copper layer, the bottom copper layer (for 2-sided PCBs), and an optional silk screen layer. The PDFs can be created directly from any PDF drawing software, or a PDF print driver can be used to capture the Print output if the drawing software does not directly support output to PDF.

The general workflow is as follows:

  1. Design the PCB using your favorite CAD or drawing software.
  2. Print the top and bottom copper and top silk screen layers to a PDF file.
  3. Run Pdf2Gerb on the PDFs to create Gerber and Excellon files.
  4. Use a Gerber viewer to double-check the output against the original PCB design.
  5. Make adjustments as needed.
  6. Submit the files to a PCB manufacturer.

Please note that Pdf2Gerb does NOT perform DRC (Design Rule Checks), as these will vary according to individual PCB manufacturer conventions and capabilities. Also note that Pdf2Gerb is not perfect, so the output files must always be checked before submitting them. As of version 1.6, Pdf2Gerb supports most PCB elements, such as round and square pads, round holes, traces, SMD pads, ground planes, no-fill areas, and panelization. However, because it interprets the graphical output of a Print function, there are limitations in what it can recognize (or there may be bugs).

See docs/Pdf2Gerb.pdf for install/setup, config, usage, and other info.


pdf2gerb_cfg.pm

#Pdf2Gerb config settings:
#Put this file in same folder/directory as pdf2gerb.pl itself (global settings),
#or copy to another folder/directory with PDFs if you want PCB-specific settings.
#There is only one user of this file, so we don't need a custom package or namespace.
#NOTE: all constants defined in here will be added to main namespace.
#package pdf2gerb_cfg;

use strict; #trap undef vars (easier debug)
use warnings; #other useful info (easier debug)


##############################################################################################
#configurable settings:
#change values here instead of in main pdf2gerb.pl file

use constant WANT_COLORS => ($^O !~ m/Win/); #ANSI colors no worky on Windows? this must be set < first DebugPrint() call

#just a little warning; set realistic expectations:
#DebugPrint("${\(CYAN)}Pdf2Gerb.pl ${\(VERSION)}, $^O O/S\n${\(YELLOW)}${\(BOLD)}${\(ITALIC)}This is EXPERIMENTAL software.  \nGerber files MAY CONTAIN ERRORS.  Please CHECK them before fabrication!${\(RESET)}", 0); #if WANT_DEBUG

use constant METRIC => FALSE; #set to TRUE for metric units (only affect final numbers in output files, not internal arithmetic)
use constant APERTURE_LIMIT => 0; #34; #max #apertures to use; generate warnings if too many apertures are used (0 to not check)
use constant DRILL_FMT => '2.4'; #'2.3'; #'2.4' is the default for PCB fab; change to '2.3' for CNC

use constant WANT_DEBUG => 0; #10; #level of debug wanted; higher == more, lower == less, 0 == none
use constant GERBER_DEBUG => 0; #level of debug to include in Gerber file; DON'T USE FOR FABRICATION
use constant WANT_STREAMS => FALSE; #TRUE; #save decompressed streams to files (for debug)
use constant WANT_ALLINPUT => FALSE; #TRUE; #save entire input stream (for debug ONLY)

#DebugPrint(sprintf("${\(CYAN)}DEBUG: stdout %d, gerber %d, want streams? %d, all input? %d, O/S: $^O, Perl: $]${\(RESET)}\n", WANT_DEBUG, GERBER_DEBUG, WANT_STREAMS, WANT_ALLINPUT), 1);
#DebugPrint(sprintf("max int = %d, min int = %d\n", MAXINT, MININT), 1); 

#define standard trace and pad sizes to reduce scaling or PDF rendering errors:
#This avoids weird aperture settings and replaces them with more standardized values.
#(I'm not sure how photoplotters handle strange sizes).
#Fewer choices here gives more accurate mapping in the final Gerber files.
#units are in inches
use constant TOOL_SIZES => #add more as desired
(
#round or square pads (> 0) and drills (< 0):
    .010, -.001,  #tiny pads for SMD; dummy drill size (too small for practical use, but needed so StandardTool will use this entry)
    .031, -.014,  #used for vias
    .041, -.020,  #smallest non-filled plated hole
    .051, -.025,
    .056, -.029,  #useful for IC pins
    .070, -.033,
    .075, -.040,  #heavier leads
#    .090, -.043,  #NOTE: 600 dpi is not high enough resolution to reliably distinguish between .043" and .046", so choose 1 of the 2 here
    .100, -.046,
    .115, -.052,
    .130, -.061,
    .140, -.067,
    .150, -.079,
    .175, -.088,
    .190, -.093,
    .200, -.100,
    .220, -.110,
    .160, -.125,  #useful for mounting holes
#some additional pad sizes without holes (repeat a previous hole size if you just want the pad size):
    .090, -.040,  #want a .090 pad option, but use dummy hole size
    .065, -.040, #.065 x .065 rect pad
    .035, -.040, #.035 x .065 rect pad
#traces:
    .001,  #too thin for real traces; use only for board outlines
    .006,  #minimum real trace width; mainly used for text
    .008,  #mainly used for mid-sized text, not traces
    .010,  #minimum recommended trace width for low-current signals
    .012,
    .015,  #moderate low-voltage current
    .020,  #heavier trace for power, ground (even if a lighter one is adequate)
    .025,
    .030,  #heavy-current traces; be careful with these ones!
    .040,
    .050,
    .060,
    .080,
    .100,
    .120,
);
#Areas larger than the values below will be filled with parallel lines:
#This cuts down on the number of aperture sizes used.
#Set to 0 to always use an aperture or drill, regardless of size.
use constant { MAX_APERTURE => max((TOOL_SIZES)) + .004, MAX_DRILL => -min((TOOL_SIZES)) + .004 }; #max aperture and drill sizes (plus a little tolerance)
#DebugPrint(sprintf("using %d standard tool sizes: %s, max aper %.3f, max drill %.3f\n", scalar((TOOL_SIZES)), join(", ", (TOOL_SIZES)), MAX_APERTURE, MAX_DRILL), 1);

#NOTE: Compare the PDF to the original CAD file to check the accuracy of the PDF rendering and parsing!
#for example, the CAD software I used generated the following circles for holes:
#CAD hole size:   parsed PDF diameter:      error:
#  .014                .016                +.002
#  .020                .02267              +.00267
#  .025                .026                +.001
#  .029                .03167              +.00267
#  .033                .036                +.003
#  .040                .04267              +.00267
#This was usually ~ .002" - .003" too big compared to the hole as displayed in the CAD software.
#To compensate for PDF rendering errors (either during CAD Print function or PDF parsing logic), adjust the values below as needed.
#units are pixels; for example, a value of 2.4 at 600 dpi = .004 inch, 2 at 600 dpi = .0033"
use constant
{
    HOLE_ADJUST => -0.004 * 600, #-2.6, #holes seemed to be slightly oversized (by .002" - .004"), so shrink them a little
    RNDPAD_ADJUST => -0.003 * 600, #-2, #-2.4, #round pads seemed to be slightly oversized, so shrink them a little
    SQRPAD_ADJUST => +0.001 * 600, #+.5, #square pads are sometimes too small by .00067, so bump them up a little
    RECTPAD_ADJUST => 0, #(pixels) rectangular pads seem to be okay? (not tested much)
    TRACE_ADJUST => 0, #(pixels) traces seemed to be okay?
    REDUCE_TOLERANCE => .001, #(inches) allow this much variation when reducing circles and rects
};

#Also, my CAD's Print function or the PDF print driver I used was a little off for circles, so define some additional adjustment values here:
#Values are added to X/Y coordinates; units are pixels; for example, a value of 1 at 600 dpi would be ~= .002 inch
use constant
{
    CIRCLE_ADJUST_MINX => 0,
    CIRCLE_ADJUST_MINY => -0.001 * 600, #-1, #circles were a little too high, so nudge them a little lower
    CIRCLE_ADJUST_MAXX => +0.001 * 600, #+1, #circles were a little too far to the left, so nudge them a little to the right
    CIRCLE_ADJUST_MAXY => 0,
    SUBST_CIRCLE_CLIPRECT => FALSE, #generate circle and substitute for clip rects (to compensate for the way some CAD software draws circles)
    WANT_CLIPRECT => TRUE, #FALSE, #AI doesn't need clip rect at all? should be on normally?
    RECT_COMPLETION => FALSE, #TRUE, #fill in 4th side of rect when 3 sides found
};

#allow .012 clearance around pads for solder mask:
#This value effectively adjusts pad sizes in the TOOL_SIZES list above (only for solder mask layers).
use constant SOLDER_MARGIN => +.012; #units are inches

#line join/cap styles:
use constant
{
    CAP_NONE => 0, #butt (none); line is exact length
    CAP_ROUND => 1, #round cap/join; line overhangs by a semi-circle at either end
    CAP_SQUARE => 2, #square cap/join; line overhangs by a half square on either end
    CAP_OVERRIDE => FALSE, #cap style overrides drawing logic
};
    
#number of elements in each shape type:
use constant
{
    RECT_SHAPELEN => 6, #x0, y0, x1, y1, count, "rect" (start, end corners)
    LINE_SHAPELEN => 6, #x0, y0, x1, y1, count, "line" (line seg)
    CURVE_SHAPELEN => 10, #xstart, ystart, x0, y0, x1, y1, xend, yend, count, "curve" (bezier 2 points)
    CIRCLE_SHAPELEN => 5, #x, y, radius, count, "circle" (center + radius)
};
#const my %SHAPELEN =
#Readonly my %SHAPELEN =>
our %SHAPELEN =
(
    rect => RECT_SHAPELEN,
    line => LINE_SHAPELEN,
    curve => CURVE_SHAPELEN,
    circle => CIRCLE_SHAPELEN,
);

#panelization:
#This will repeat the entire body the number of times indicated along the X or Y axes (files grow accordingly).
#Display elements that overhang PCB boundary can be squashed or left as-is (typically text or other silk screen markings).
#Set "overhangs" TRUE to allow overhangs, FALSE to truncate them.
#xpad and ypad allow margins to be added around outer edge of panelized PCB.
use constant PANELIZE => {'x' => 1, 'y' => 1, 'xpad' => 0, 'ypad' => 0, 'overhangs' => TRUE}; #number of times to repeat in X and Y directions

# Set this to 1 if you need TurboCAD support.
#$turboCAD = FALSE; #is this still needed as an option?

#CIRCAD pad generation uses an appropriate aperture, then moves it (stroke) "a little" - we use this to find pads and distinguish them from PCB holes. 
use constant PAD_STROKE => 0.3; #0.0005 * 600; #units are pixels
#convert very short traces to pads or holes:
use constant TRACE_MINLEN => .001; #units are inches
#use constant ALWAYS_XY => TRUE; #FALSE; #force XY even if X or Y doesn't change; NOTE: needs to be TRUE for all pads to show in FlatCAM and ViewPlot
use constant REMOVE_POLARITY => FALSE; #TRUE; #set to remove subtractive (negative) polarity; NOTE: must be FALSE for ground planes

#PDF uses "points", each point = 1/72 inch
#combined with a PDF scale factor of .12, this gives 600 dpi resolution (72 / .12 = 600 dpi)
use constant INCHES_PER_POINT => 1/72; #0.0138888889; #multiply point-size by this to get inches

# The precision used when computing a bezier curve. Higher numbers are more precise but slower (and generate larger files).
#$bezierPrecision = 100;
use constant BEZIER_PRECISION => 36; #100; #use const; reduced for faster rendering (mainly used for silk screen and thermal pads)

# Ground planes and silk screen or larger copper rectangles or circles are filled line-by-line using this resolution.
use constant FILL_WIDTH => .01; #fill at most 0.01 inch at a time

# The max number of characters to read into memory
use constant MAX_BYTES => 10 * M; #bumped up to 10 MB, use const

use constant DUP_DRILL1 => TRUE; #FALSE; #kludge: ViewPlot doesn't load drill files that are too small so duplicate first tool

my $runtime = time(); #Time::HiRes::gettimeofday(); #measure my execution time

print STDERR "Loaded config settings from '${\(__FILE__)}'.\n";
1; #last value must be truthful to indicate successful load


#############################################################################################
#junk/experiment:

#use Package::Constants;
#use Exporter qw(import); #https://perldoc.perl.org/Exporter.html

#my $caller = "pdf2gerb::";

#sub cfg
#{
#    my $proto = shift;
#    my $class = ref($proto) || $proto;
#    my $settings =
#    {
#        $WANT_DEBUG => 990, #10; #level of debug wanted; higher == more, lower == less, 0 == none
#    };
#    bless($settings, $class);
#    return $settings;
#}

#use constant HELLO => "hi there2"; #"main::HELLO" => "hi there";
#use constant GOODBYE => 14; #"main::GOODBYE" => 12;

#print STDERR "read cfg file\n";

#our @EXPORT_OK = Package::Constants->list(__PACKAGE__); #https://www.perlmonks.org/?node_id=1072691; NOTE: "_OK" skips short/common names

#print STDERR scalar(@EXPORT_OK) . " consts exported:\n";
#foreach(@EXPORT_OK) { print STDERR "$_\n"; }
#my $val = main::thing("xyz");
#print STDERR "caller gave me $val\n";
#foreach my $arg (@ARGV) { print STDERR "arg $arg\n"; }

Download Details:

Author: swannman
Source Code: https://github.com/swannman/pdf2gerb

License: GPL-3.0 license

#perl 


Microsoft Power BI Consulting | Power BI Solutions in India

Hire top dedicated Microsoft Power BI consultants from ValueCoders who aim at leveraging their potential to address organizational challenges for large-scale data storage and seamless processing.

We have a team of dedicated Power BI consultants who help start-ups, SMEs, and enterprises analyse business data and get useful insights.

What are you waiting for? Contact us now!

No Freelancers, 100% Own Staff
Experienced Consultants
Continuous Monitoring
Lean Processes, Agile Mindset
Non-Disclosure Agreement
Up To 2X Less Time

##power bi service #power bi consultant #power bi consultants #power bi consulting #power bi developer #power bi development


Hire Power BI Developer | Microsoft Power BI consultants in India

Hire our expert Power BI consultants to make the most out of your business data. Our Power BI developers have deep knowledge of Microsoft Power BI data modeling, structuring, and analysis. 16+ Yrs exp | 2500+ Clients | 450+ Team

Visit Website - https://www.valuecoders.com/hire-developers/hire-power-bi-developer-consultants

#power bi service #power bi consultant #power bi consultants #power bi consulting #power bi developer #power bi consulting services

Is Power BI Actually Useful?

The short answer, for most of you, is no. However, the complexity and capability of the product could be beneficial depending on the type of position or organization you work in.
In my effort to answer this common question about Power BI, I researched the following:
– Power BI Desktop Gateway
– Syncing on-prem SQL server data
– Syncing SharePoint Online list data
– Syncing data from an Excel workbook
– Building, and sharing a dashboard
– Inserting a Power BI visualization into PowerPoint


The feature spread above gave me the opportunity to explore the main features of Power BI, which break down as:
– Ingesting data, building a data set
– Creating dashboard or reports with visualizations based on that data

In a nutshell, Power BI is a simple concept. You take a data set and build visualizations that answer questions about that data. For example, how many products have we sold in Category A in the last month? Quarter? Year? Power BI is especially powerful when drilling up or down in time scale.
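To illustrate the idea of one data set answering the same question at several time grains (this is the underlying concept, not Power BI’s actual mechanism), here is a small pandas sketch with made-up sales data:

import pandas as pd

# Hypothetical sales data: one row per sale
sales = pd.DataFrame({
    "date": pd.to_datetime(["2021-01-05", "2021-02-11", "2021-04-02", "2021-04-20"]),
    "category": ["A", "A", "A", "B"],
    "units": [3, 5, 2, 7],
})

# How many units of Category A did we sell per month? Quarter? Year?
cat_a = sales[sales["category"] == "A"].set_index("date")
by_month   = cat_a["units"].resample("M").sum()   # finest grain (drill down)
by_quarter = cat_a["units"].resample("Q").sum()   # drill up
by_year    = cat_a["units"].resample("Y").sum()   # drill up further
print(by_month, by_quarter, by_year, sep="\n\n")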
And there are some interesting ways to visualize that data.
However, there are a number of drawbacks to the current product that prevented me from being able to fold these visualizations into our existing business processes.

  1. Integration with PowerPoint is not free. This shocked me.

The most inspiring Power BI demo I saw at a Microsoft event showed a beautiful globe visualization within a PowerPoint presentation. It rendered flawlessly within PowerPoint and was a beautiful, interactive way to explore a geographically disparate data set. I was able to derive conclusions about the sales data displayed without having to look at an old, boring chart.

During the demo, nothing was mentioned about the technology required to make this embedded chart a reality. After looking into the PowerPoint integration I learned that not only was the add-in built by a third party, it was not free, and when I signed up for a free trial the add-in could barely render my Power BI visualization. The data drill up/down functionality was non-existent and not all of the visualizations were supported.

  2. Only Dashboards can be shared with other users, and cannot be embedded in our organization’s community on SharePoint.

Folks in our organization spent 50% of their time in Outlook, and the rest in SharePoint, OneNote, Excel, Word, and the other applications needed to produce documents and do other work. Adding yet another destination to that list to check on how something is doing was impossible for us. Habits are extremely hard to change, and I see that consistently in our clients’ organizations as well.

Because I was not able to fold in the visualizations with the PowerPoint decks we use during meetings, I had to stop presentations in the middle, navigate to Internet Explorer (because the visualizations only render well in that browser), and then go back to PowerPoint once we were done looking at the dashboard.

This broke up the flow of our meetings, and led to more distractions. I also followed up with coworkers after meetings to see if they ever visited the dashboard themselves at their desk. None of them had ever navigated to a dashboard outside of a meeting.

  3. The visualizations aren’t actually that great.

Creating visualizations that cover such a wide variety of data sets is difficult. But the Excel team has been working on this problem for over 15 years. When I import my SharePoint or SQL data to Excel, I’m able to create extremely customized Pivot Tables and Charts that show precisely the data I need to see.

I was never able to replicate visualizations from Excel in Power BI, to produce the types of visualizations I actually needed. Excel has the ability to do conditional formatting and other customizations in charts and tables that are simply not possible with Power BI. Because of how generic the charts are and how limited the customization is, Power BI looks “cool” without being functional.

In conclusion, if you have spare time and want to explore Power BI for your organization you should. However, if you are seriously thinking about how you can fold this product into your work processes, challenge yourself to build a dashboard and look at it once a week. See if you can keep that up for a month, and then think about how that change affected your work habits and whether the data analysis actually contributed value each time. At least half of you will realize that this gimmicky product is fancy, but not actually useful.


#power bi training #power bi course #learn power bi #power bi online training #microsoft power bi training #power bi online course

Tableau vs Power BI: Comparing the Data Visualization Tools

In analytics, Tableau is the leading visualisation tool. Its rich analytical features and attention to data details are the reasons behind its popularity. Power BI, on the other hand, is preferred by professionals who are more comfortable with Microsoft Office 365. Users can connect Excel queries and data models, and report to dashboards.

While the usage of both these tools might depend on many factors, here is a quick comparison of the two popular tools on various functionalities.


1. Performance
One of the crucial differences between Tableau and Power BI is that Tableau is an extensible platform which not only provides visualisations but also helps in gaining a better understanding of the data. Both tools are excellent at visualisation, but when it comes to depth of data, Tableau helps an analyst dive deeper by performing “what-if” analysis on the data.

2. Flexibility
A user can deploy Tableau on-premises, on the public cloud (Microsoft Azure, Amazon Web Services, or Google Cloud Platform), or on Tableau Online. Power BI, an outgrowth of Microsoft Excel, is not as flexible, as it is offered only as a software-as-a-service model.

3. User Interface
Tableau is designed mainly with data analysts in mind. Its richer analytical capabilities for visualisation help an analyst gain insight into large datasets, and it allows the user to create customised dashboards, which can be considered more of a pro-level feature. Microsoft Power BI is simpler than Tableau and offers a more intuitive interface, especially for beginners. The tool can be used by coders and non-coders alike.

4. Visualisation
Power BI focuses on data modelling, offers data-manipulation features, and then provides data visualisation, while Tableau focuses strictly on data visualisation.

5. AI-Powered
Power BI, with Microsoft Flow and its AI Builder tool, can help in building apps with a layer of intelligence. With the advantages of Microsoft AI, the user can prepare data, build machine learning models, and gain insights from both structured and unstructured data. On the other hand, Tableau is working on natural language capabilities, known as Ask Data, to simplify analytics and help users who have no prior data-analysis experience. Recently, Tableau also announced the beta version of Explain Data, a new AI-powered feature to help users understand the “why” behind unexpected values in their data.

6. Price
Power BI offers two subscription tiers, Power BI Pro and Power BI Premium. Power BI Pro is priced at $9.99 per user per month; it is a self-service BI offering where users can collaborate, publish, share, and perform ad-hoc analysis. Power BI Premium is priced at $4,995 per month per dedicated cloud compute and storage resource; here, the user can perform big data analytics, advanced administration, and more.

Tableau, on the other hand, offers three subscriptions: Tableau Creator, Explorer, and Viewer. Tableau Creator is priced at $70 per user per month and includes Tableau Desktop, Prep Builder, and one Creator license of Tableau Server. Tableau Explorer is priced at $35 per user per month to explore trusted data with self-service analytics, and Tableau Viewer is priced at $12 per user per month to view and interact with dashboards and visualisations in a secured way. These prices are for teams and organisations where multiple users/viewers are required.
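As a rough worked example using the list prices above (and assuming a hypothetical team of 10 with two report authors):

Power BI Pro: 10 users x $9.99 = $99.90 per month
Tableau: 2 Creators x $70 + 8 Explorers x $35 = $140 + $280 = $420 per month

The comparison is not apples-to-apples, since capabilities differ per tier, but it shows how quickly per-seat pricing diverges for mixed teams.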

#power bi certification #power bi training #power bi online training hyderabad #power bi course #power bi online training #power bi online training india