Arvel Miller

Introduction to Computational Gastronomy and Food Science

The application of data-driven strategies to investigating gastronomic data has opened up exciting avenues, giving rise to an all-new field of ‘Computational Gastronomy’. This emerging interdisciplinary science asks questions of culinary origin and seeks their answers through the compilation of culinary data and its analysis using methods from statistics, computer science, and artificial intelligence. Along with complementary experimental studies, these endeavors have the potential to transform the food landscape by effectively leveraging data-driven food innovations for better health and nutrition.

Topics covered in this talk include:

  1. Recipes and food pairing
  2. Taste, nutrition and health
  3. Novel recipe generation
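Food pairing is one concrete example of the data-driven approach: the widely cited flavour-network studies score ingredient pairs by the number of flavour compounds they share. The sketch below is a minimal, hypothetical Python illustration of that idea; the ingredient-to-compound mapping is invented for demonstration and is not real flavour data.

```python
from itertools import combinations

# Hypothetical ingredient -> flavour-compound sets (toy data, not real measurements).
flavor_compounds = {
    "tomato": {"hexanal", "furaneol", "methional"},
    "basil": {"linalool", "eugenol", "hexanal"},
    "strawberry": {"furaneol", "linalool", "hexanal"},
}

# The food-pairing heuristic: ingredients sharing more compounds pair "better".
for a, b in combinations(flavor_compounds, 2):
    shared = flavor_compounds[a] & flavor_compounds[b]
    print(f"{a} + {b}: {len(shared)} shared compounds {sorted(shared)}")
```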

#data science

Uriah Dietrich

How To Build A Data Science Career In 2021

For this week’s data science career interview, we got in touch with Dr Suman Sanyal, Associate Professor of Computer Science and Engineering at NIIT University. In this interview, Dr Sanyal shares his insights on how universities can contribute to this highly promising sector and what aspirants can do to build a successful data science career.

With its industry-linked, technology- and research-driven, seamless education, NIIT University has been recognised for addressing the growing demand for data science experts worldwide with its industry-ready courses. The university has recently introduced a B.Tech in Data Science course, which aims to teach students to deploy data sets and models to solve real-world problems. The programme provides industry-academic synergy for students to establish careers in data science, artificial intelligence and machine learning.

“Students with skills that are aligned to new-age technology will be of huge value. The industry today wants young, ambitious students who have the know-how on how to get things done,” Sanyal said.

#careers #data science aspirant #data science career #data science career interview #data science education #data science education market #data science jobs #niit university data science

77 Programming Language Q&A (P4)

The following article deals with questions/answers you may encounter when asked about Lexical and Syntax Analysis. Check the bottom of the page for links to the other questions and answers I’ve come up with to make you a great Computer Scientist (when it comes to Programming Languages).

324. What are the 3 approaches to implementing programming languages?

  • Compilation, Pure Interpretation and Hybrid Implementation

325. What is the job of the syntax analyzer?

  • Check the syntax of the program and create a parse tree.

326. What are the syntax analyzers based on?

  • Formal description of the syntax of programs, usually BNF.

327. What are some of the advantages of using BNF?

  • Descriptions are clear and concise.

  • Syntax analyzers can be generated directly from BNF.

  • Implementations based on BNF are easy to maintain.
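As a rough illustration of how a syntax analyzer can be derived directly from a BNF description, here is a minimal hand-written recursive-descent parser in Python for a toy expression grammar. The grammar and token list are invented for demonstration; real generators such as yacc or ANTLR automate this step.

```python
# Toy grammar (EBNF-style), invented for illustration:
#   <expr>   -> <term>   { ("+" | "-") <term> }
#   <term>   -> <factor> { ("*" | "/") <factor> }
#   <factor> -> NUMBER | "(" <expr> ")"
# Each nonterminal becomes one parsing function, mirroring the BNF rules.

def parse_expr(tokens, i=0):
    node, i = parse_term(tokens, i)
    while i < len(tokens) and tokens[i] in ("+", "-"):
        op = tokens[i]
        right, i = parse_term(tokens, i + 1)
        node = (op, node, right)          # build a parse-tree node
    return node, i

def parse_term(tokens, i):
    node, i = parse_factor(tokens, i)
    while i < len(tokens) and tokens[i] in ("*", "/"):
        op = tokens[i]
        right, i = parse_factor(tokens, i + 1)
        node = (op, node, right)
    return node, i

def parse_factor(tokens, i):
    if tokens[i] == "(":
        node, i = parse_expr(tokens, i + 1)
        return node, i + 1                # skip the closing ")"
    return ("num", tokens[i]), i + 1      # a NUMBER token

tree, _ = parse_expr(["3", "+", "4", "*", "2"])
print(tree)   # ('+', ('num', '3'), ('*', ('num', '4'), ('num', '2')))
```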

328. What are the 2 distinct parts of syntax analysis and what do they do?

  • Lexical analysis: deals with small-scale language constructs such as names

  • Syntax analysis: deals with large-scale constructs such as expressions

329. What are the 3 reasons for why lexical analysis is separated from syntax analysis?

  • Simplicity, Efficiency and Portability

330. What does the lexical analyzer do?

  • Collects input characters into groups (lexemes) and assigns an internal code (token) to each group.

331. How are lexemes recognized?

  • By matching the input against patterns.
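A minimal sketch of this grouping-and-tagging step in Python, using regular expressions as the patterns; the token names and lexeme classes are chosen for illustration rather than taken from any particular compiler.

```python
import re

# Each pattern describes one class of lexeme; the name is the internal token code.
TOKEN_PATTERNS = [
    ("NUMBER", r"\d+"),
    ("IDENT",  r"[A-Za-z_]\w*"),
    ("OP",     r"[+\-*/=]"),
    ("LPAREN", r"\("),
    ("RPAREN", r"\)"),
    ("SKIP",   r"\s+"),          # whitespace is recognised but not emitted
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_PATTERNS))

def lex(source):
    """Group input characters into lexemes and attach a token code to each."""
    for match in MASTER.finditer(source):
        if match.lastgroup != "SKIP":
            yield (match.lastgroup, match.group())

print(list(lex("total = price * 2")))
# [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '*'), ('NUMBER', '2')]
```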

#programming #programming-languages #computer-science-theory #computer-science #computer-science-student #data science

Sasha Lee

From Science to Data Science

  1. Introduction and Hypothesis

I loved working as a scientist. There is a deep feeling of completion and happiness when you manage to answer a why. Finding out why a given animal goes to a certain place, why it behaves a certain way at that time of year, why one site is so diverse… This applies to any field. This is why I want to argue that if you are a scientist, you might want to have a look at what the technological field calls Data Science. Be aware, I will not dwell on the distinctions between titles such as data engineer, data analyst, data scientist or AI researcher. Here, when I refer to Data Science, I mean the science of finding insights from data collected about a subject.

So, back to our **_why_**. In science, in order to answer your why, you introduce the whole context surrounding it and then formulate a hypothesis: “The timing of the diapause in copepods is regulated through their respiration, ammonia excretion and water column temperature.” The behaviour of the subject is the result of internal and external processes.

In marketing, you would formulate a similar hypothesis in order to start your investigation: “Three-day-old users unsubscribe due to the lack of a direct path towards the check-out.” The behaviour of the subject is the result of internal (frustration) and external (unoptimised UX/UI) processes.
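In both cases the hypothesis is then confronted with data. As a purely hypothetical sketch of how the marketing hypothesis above might be tested, one could compare unsubscribe rates between users who did and did not have a direct path to check-out, for example with a chi-square test of independence (the counts below are invented):

```python
# Hypothetical sketch: compare unsubscribe rates between two groups of
# three-day-old users with a chi-square test of independence (scipy).
from scipy.stats import chi2_contingency

# Invented counts: [unsubscribed, stayed] for each group.
no_direct_path = [120, 880]   # users without a direct path to check-out
direct_path    = [70, 930]    # users with a direct path to check-out

chi2, p_value, dof, expected = chi2_contingency([no_direct_path, direct_path])
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")
# A small p-value would support the hypothesis that the missing direct path
# is associated with a higher unsubscribe rate.
```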

Although I would have preferred to put this part at the end, as in any scientific paper, it goes without saying that your introduction should present the current ideas, results and hypotheses of your field of research. So, as a researcher, you need to accumulate knowledge about your subject, and you go looking for scientific articles. The same is true in tech. There are plenty of scientific and non-scientific resources out there that will allow you to better understand, interpret and improve your product. Take this article, for instance; Medium is a wonderful base of knowledge on so many topics! But you could also find fascinating articles in PLoS ONE on user experience or marketing design, and so on.

2. Material and Methods

As a marine biologist and later an oceanographer, I took great pleasure in going into the field and collecting data (platyhelminths, fish counts, zooplankton, etc.). Then we needed to translate the living “data” into numeric data. In the technological industry, it is the same idea. Instead of nets, quadrats and terrain coverage, you will set up tracking events, collect postbacks from your partners and pull third-party data. The idea is the same: “How do I get the information that will help me answer my why?” So a field sampling mission and a data collection plan have a lot in common.

#ai #data-science #science #tech #data science #from science

Julie Donnelly

History of Computing PtI

We learn and (hopefully) know a basic history of the world, particularly major events like the French Revolution, the American Civil War, World War I, World War II (wow, lots of wars), the Space Age, etc. It is important to understand the concepts behind these and many other historical events. Being able to recall the start year or the exact details of how such events unfolded is one thing, but on a human level it is more important to understand the rationale, lessons and philosophy of major events. Ultimately, history teaches us what makes us innately human. Furthermore, understanding history helps us realise how and why we operate today. History provides the context for today. It makes today seem ‘obvious,’ ‘justifiable’ and ‘logical’ given the previous events that unfolded.

So, following this thread of logic, understanding the history of computers should help us understand how we got to today. A today when computers mediate much of our communication with one another. A today where computers and screens are stared at for many (and often a majority) of our waking hours (especially during Covid). A today where the thought of working, socialising or learning without a computer would be an affront and a disadvantage. Just as major events like World War II and the Cold War have greatly contributed to today’s political and social climate, I would argue computers influence just as much (if not more) of our daily lives.

Therefore it is important for us to understand the evolution of computers to understand where we may be heading in our relationship with computers.

I would like to preface the following articles outlining the history of computers by saying that this is in no way an exhaustive history of the origin of computers. Some major events have been glossed over, while other meaningful contributions have been omitted entirely.

Whilst the thought of history may make some eyes automatically glaze over, I will try to make the following series as painless and exciting as possible. While I paint a story of linear progress in computation, this is hindsight bias in action. We like to create a story of history, attributing importance to some events and not others, when in reality, as these events were unfolding (and continue to unfold), it was not always obvious what was a gigantic discovery. It is only now, with some distance, that we can appreciate past events. This means that perhaps in ten years this recounting will emphasise other features and neglect some of the stories we today find so foundational to the computer’s creation.

With all this in mind, let’s begin!


The first computers

Since their inception, computers have taken over human work by performing tedious, complex and repetitive tasks. Interestingly, _the word computer initially described humans_! Initially, computers were humans (often women) who were able to perform complex mathematical computations, usually with pen and paper. Often, teams would work on the same calculation independently to confirm the end results. It is interesting to note that, initially, when electronic computers were developed, they were referred to as such: electronic computers. With time, as electronic computers became more and more pervasive and powerful, it was the human computer that was deemed obsolete and inefficient. The ‘electronic’ was dropped, and now when we discuss computers we think of nothing else besides our graceful and versatile electronic tools. It is important to keep the computer’s mathematical origins in mind, as we will see they only further emphasise the never-imagined pervasiveness and uses of computers today.

Our story begins with the humble abacus, generally considered the first computer. When researching, I was puzzled how an abacus could be considered a computer. Luckily, my curiosity was settled by a quick Google search (thank you, Google). Google was even able to suggest my search before I finished typing ‘Why is the abacus considered the first computer’! I ended up on trusty Quora, where one user, Chrissie Nysen, put things simply: “Because it is used to compute things.” Though estimates vary, the abacus is thought to have originated in Babylon approximately 5,000 years ago. The role of the abacus was to ease simple mathematical calculations: addition, subtraction, division and multiplication. In this sense, we can consider the abacus a simple calculator. As farming, produce and populations increased in size, the abacus allowed the educated to more easily manage logistics. After the abacus, the computer’s evolution remained dormant for some time…

#history #computer-science #history-of-technology #computers #computer-history #data science

Program a Quantum Computer Today

Quantum computing is one of the most rapidly advancing technologies. Many companies and research labs are racing to deliver functional quantum hardware to the market as soon as they can. It is one of those fields where every little bit of progress is a significant advancement.

At the moment, there’s no perfect quantum computer that is capable of running promising algorithms, such as Shor’s and Grover’s algorithms. However, current quantum machines are advancing rapidly. IBM speculates that during the next decade, quantum computers will offer an undeniable advantage by solving many problems that are unsolvable on a classical computer.

In 2019, IBM proposed a metric to measure how capable and efficient a quantum computer is (on the hardware side), and they called it Quantum Volume (QV). QV is a number calculated based on different factors, such as the number of qubits in the computer, their connectivity, and the measurement error probability. For us to run real-life-sized algorithms on real hardware, we need a large QV. For reference, the highest QV device owned by IBM at the moment is 32.

On the software side, some researchers predict that the market need for quantum programmers will grow exponentially over the next decade. Companies such as Google, IBM, and Microsoft are putting in considerable effort and a massive amount of funds to train the next generation of quantum researchers/programmers.

To program a quantum computer, you don’t need an advanced degree in physics or maths. In my opinion, you just need a good imagination.
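As a small taste of what that programming looks like, here is a minimal sketch using Qiskit, IBM’s open-source Python SDK (the choice of Qiskit is an assumption; the article does not name a framework). It builds a two-qubit Bell state and inspects the ideal statevector; running on real IBM hardware would additionally require measurements and an IBM Quantum account.

```python
# A minimal sketch assuming Qiskit is installed (pip install qiskit).
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# Build a 2-qubit Bell-state circuit: Hadamard then CNOT entangles the qubits.
qc = QuantumCircuit(2)
qc.h(0)        # put qubit 0 into superposition
qc.cx(0, 1)    # entangle qubit 1 with qubit 0
print(qc)

# Simulate the ideal statevector; on real hardware you would add measurements
# and submit the circuit to an IBM Quantum backend instead.
state = Statevector.from_instruction(qc)
print(state)
```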

#computer-science #programming #science #quantum-computing #technology #data science