Meggie Flatley

Brain Computer Interfaces: The essential role of science fiction

The brain is the final frontier of human anatomy. That isn’t to say we understand everything happening in every other part of our body, but for the most part I can tell you how the muscles in my arm work and what my stomach does when I eat a burrito. We can build an artificial kidney, a robotic arm, or even grow a new heart, but this is not true of the brain. The brain is an incredibly complex organ. Each of its roughly 100 billion cells connects to thousands of others, creating over 100 trillion connections. This complex web depends on precise timing and electro-chemical processes whose basic science we barely understand. It’s no wonder that we haven’t yet grasped it all.

This article covers the history of brain-computer interfaces in science fiction and compares it to the science of the time, showing the interactions between the two.

Early Medicine

Naturally, humans have always been fascinated by the brain, at least since we have understood its importance. Early Egyptians regarded the brain as ‘cranial stuffing’: something that could become problematic and cause headaches, but not worth any other thought, much as we think about the appendix now. Instead, the idea at the time was that the heart was responsible for our thoughts and feelings.

It wasn’t until a bit later (162 AD) that the physician and anatomist Galen looked at the soldiers coming in for treatment and thought, “All these people getting hit in the head with swords aren’t thinking straight. Maybe the brain is responsible for our thoughts.” He was, of course, banned from pursuing that line of thinking.

Later, in the 1500s, Vesalius published De Humani Corporis Fabrica (On the Fabric of the Human Body), a book considered the foundation of modern anatomy. In it he proposed that the brain was responsible for sensation and movement, acting through the network of nerves that stretched from the brain throughout the human body. This was a monumental milestone in the development of neuroscience.

Early Machines and Stories About Them

Humans have been thinking about thinking for a long time, but we have been captivated by machines even longer. In fact, the use of tools stretches so far back into human history that it technically predates Homo sapiens as a species. But what do humans do to the things we love? We personify them. From the beginning of history, we see people making machines in their own image. Throughout history, we see the creation of puppets and complex statues that use mechanics to mimic human movement or sound. We also get fantastical descriptions of mechanical beings, or automatons, that mimic people.

#history-of-technology #science-fiction #future #science #brain-computer-interface


Uriah Dietrich

How To Build A Data Science Career In 2021

For this week’s data science career interview, we got in touch with Dr Suman Sanyal, Associate Professor of Computer Science and Engineering at NIIT University. In this interview, Dr Sanyal shares his insights on how universities can contribute to this highly promising sector and what aspirants can do to build a successful data science career.

With industry-linked, technology- and research-driven seamless education, NIIT University has been recognised for addressing the growing demand for data science experts worldwide with its industry-ready courses. The university has recently introduced a B.Tech in Data Science course, which aims to teach students to build and deploy data models that solve real-world problems. The programme provides industry-academic synergy for students to establish careers in data science, artificial intelligence and machine learning.

“Students with skills that are aligned to new-age technology will be of huge value. The industry today wants young, ambitious students who have the know-how on how to get things done,” Sanyal said.

#careers #data-science-aspirant #data-science-career #data-science-career-interview #data-science-education #data-science-education-market #data-science-jobs #niit-university-data-science

77 Programming Language Q&A (P4)

The following article deals with questions/answers you may encounter when asked about Lexical and Syntax Analysis. Check the bottom of the page for links to the other questions and answers I’ve come up with to make you a great Computer Scientist (when it comes to Programming Languages).

324. What are the 3 approaches to implementing programming languages?

  • Compilation, Pure Interpretation and Hybrid Implementation

325. What is the job of the syntax analyzer?

  • Check the syntax of the program and create a parse tree.

326. What are the syntax analyzers based on?

  • Formal description of the syntax of programs, usually BNF.

327. What are some of the advantages of using BNF?

  • Descriptions are clear and concise.

  • Syntax analyzers can be generated directly from BNF.

  • Implementations based on BNF are easy to maintain.

328. What are the 2 distinct parts of syntax analysis and what do they do?

  • Lexical analysis: deals with small-scale language constructs such as names

  • Syntax analysis: deals with large-scale constructs such as expressions

329. What are the 3 reasons why lexical analysis is separated from syntax analysis?

  • Simplicity, Efficiency and Portability

330. What does the lexical analyzer do?

  • Collects input characters into groups (lexemes) and assigns an internal code (token) to each group.

331. How are lexemes recognized?

  • By matching the input against patterns, as in the sketch after this list.
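
To make questions 330 and 331 concrete, here is a minimal sketch of a lexical analyzer in Python. It is not tied to any particular compiler framework; the token names and patterns are illustrative assumptions for a toy expression language, not something taken from the Q&A itself.

    import re

    # Token patterns for a toy expression language (illustrative assumptions):
    # identifiers, integer literals, a few operators, and skippable whitespace.
    TOKEN_PATTERNS = [
        ("IDENT",   r"[A-Za-z_][A-Za-z_0-9]*"),
        ("INT_LIT", r"\d+"),
        ("OP",      r"[+\-*/()=]"),
        ("SKIP",    r"\s+"),   # recognised but not reported as a token
    ]

    MASTER_RE = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_PATTERNS))

    def lex(source):
        """Group input characters into lexemes and assign a token code to each."""
        pos = 0
        tokens = []
        while pos < len(source):
            match = MASTER_RE.match(source, pos)
            if match is None:
                raise SyntaxError(f"Unexpected character {source[pos]!r} at position {pos}")
            if match.lastgroup != "SKIP":
                tokens.append((match.lastgroup, match.group()))  # (token, lexeme)
            pos = match.end()
        return tokens

    print(lex("total = price * 2 + tax"))
    # [('IDENT', 'total'), ('OP', '='), ('IDENT', 'price'), ('OP', '*'),
    #  ('INT_LIT', '2'), ('OP', '+'), ('IDENT', 'tax')]

A syntax analyzer would then consume this token stream and build the parse tree described in question 325.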

#programming #programming-languages #computer-science-theory #computer-science #computer-science-student #data-science

Sasha Lee

From Science to Data Science

1. Introduction and Hypothesis

I loved working as a scientist. There is a deep feeling of completion and happiness when you manage to answer why. Finding out why a given animal would go there, why it would do this at that time of the year, why that place is so diverse… This applies to any kind of field. This is the reason I want to advocate that, if you are a scientist, you might want to have a look at what is called Data Science in the technological field. Be aware, I will not dwell on the details of titles such as data engineer, data analyst, data scientist, or AI researcher. Here, when I refer to Data Science, I mean the science of finding insights from data collected about a subject.

So, back to our why. In science, in order to answer your why, you introduce the whole context surrounding it and then formulate a hypothesis: “The timing of the diapause in copepods is regulated through their respiration, ammonia excretion and water column temperature”. The subject’s behaviour is the result of internal and external processes.

In marketing, you would formulate a similar hypothesis in order to start your investigation: “Three-day-old users unsubscribe due to the lack of a direct path towards the checkout”. The subject’s behaviour is the result of internal (frustration) and external (unoptimised UX/UI) processes.

Although I would have wanted to put this part at the end, as in any scientific paper, it goes without saying that your introduction should present the current ideas, results, and hypotheses of your field of research. So, as a researcher, you need to accumulate knowledge about your subject, and you go looking for scientific articles. The same is true in tech. There are plenty of scientific and non-scientific resources out there that will allow you to better understand, interpret and improve your product. Take this article, for instance; Medium is a wonderful base of knowledge on so many topics! But you could also find fascinating articles in PLOS ONE on user experience or marketing design, etc.

2. Material and Methods

As a marine biologist and later an oceanographer, I took great pleasure in going out into the field to collect data (platyhelminths, fish counts, zooplankton, etc.). Then we needed to translate the living “data” into numeric data. In the technological industry, it is the same idea. Instead of nets, quadrats, and terrain coverage, you will set up tracking events, collect postbacks from your partners and pull third-party data (a small sketch of a tracking event follows below). The idea is the same: “how do I get the information that will help me answer my why?” So a field sampling mission and a data collection plan have a lot in common.
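
To make the analogy concrete, here is a minimal sketch of what defining and recording a tracking event might look like in Python. The event name, property fields and the final print step are hypothetical illustrations, not a specific analytics vendor’s schema or API.

    from datetime import datetime, timezone
    import json

    def build_event(user_id, name, properties):
        """Assemble a tracking event as a plain dictionary.

        The field names (user_id, event, properties, timestamp) are
        illustrative assumptions, not a particular platform's schema.
        """
        return {
            "user_id": user_id,
            "event": name,
            "properties": properties,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

    # Hypothetical example: record that a three-day-old user reached the checkout page.
    event = build_event(
        user_id="u_42",
        name="checkout_viewed",
        properties={"days_since_signup": 3, "source": "push_notification"},
    )

    # In production this would be sent to an analytics endpoint or a queue;
    # here we simply serialise it so the collected "sample" can be inspected.
    print(json.dumps(event, indent=2))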

#ai #data-science #science #tech #from-science

Julie Donnelly

History of Computing PtI

We learn and (hopefully) know a basic history of the world, particularly major events like the French Revolution, the American Civil War, World War I, World War II (wow, lots of wars), the Space Age, etc. It is important to understand the concepts behind these and many other historical events. Being able to recall the start year or the exact details of how such events unfolded is one thing, but on a human level it is more important to understand the rationale, lessons and philosophy of major events. Ultimately, history teaches us what makes us innately human. Furthermore, understanding history helps us realise how and why we operate today. History provides the context for today. It makes today seem ‘obvious,’ ‘justifiable’ and ‘logical’ given the previous events that unfolded.

So, following this thread of logic, understanding the history of computers should help us understand how we got to today. A today when computers moderate much of our communication with one another. A today where computers and screens are stared at for many (and often a majority) of our waking hours (especially during Covid). A today where the thought of working, socialising or learning without a computer would be an affront and a disadvantage. Just as major events like World War II and the Cold War have greatly contributed to today’s political and social climate, I would argue computers influence just as much (if not more) of our daily lives.

Therefore it is important for us to understand the evolution of computers to understand where we may be heading in our relationship with computers.

I would like to preface the following articles outlining the history of computers by saying this is in no way an exhaustive history of the origin of computers. Some major events have been glossed over while other meaningful contributions have been omitted entirely.

Whilst the thought of history may make some eyes automatically glaze over, I will try to make the following series as painless and exciting as possible. While I paint a story of linear progress in computation, this is hindsight bias in action. We like to create a story of history, attributing importance to some events and not others, when in reality, as these events were unfolding (and continue to unfold), it was not always obvious what was a gigantic discovery. It is only now, with some distance, that we can appreciate past events. This means that perhaps in ten years this recounting will emphasise other features and neglect some of the stories we today find so foundational to the computer’s creation.

With all this in mind, let’s begin!


The first computers

Since their inception, computers have taken over human work by performing tedious, complex and repetitive tasks. Interestingly, the word computer initially described humans! Computers were originally humans (often women) who were able to perform complex mathematical computations — usually with pen and paper. Often teams would work on the same calculation independently to confirm the end results. It is interesting to note that when electronic computers were first developed, they were referred to as such — electronic computers. With time, as electronic computers became more and more pervasive and powerful, it was the human computer that was deemed obsolete and inefficient. The “electronic” was dropped, and now when we discuss computers we think of nothing else besides our graceful and versatile electronic tools. It is important to keep the computer’s mathematical origins in mind, as we will see they only further emphasise the never-imagined pervasiveness and uses of computers today.

Our story begins with the humble abacus, generally considered the first computer. When researching, I was puzzled how an abacus could be considered a computer. Luckily my curiosity was settled by a quick Google search (thank you Google). Google was even able to suggest my search before I finished typing ‘Why is the abacus considered the first computer’! I ended up on trusty Quora, where one user, Chrissie Nysen, put things simply: “Because it is used to compute things.” Though estimates vary, the abacus is thought to have originated in Babylon approximately 5,000 years ago. The role of the abacus was to ease simple mathematical calculations: addition, subtraction, division and multiplication. In this sense, we can consider the abacus a simple calculator. As farming, produce and populations increased in size, the abacus allowed the educated to more easily manage logistics. After the abacus, the computer’s evolution remained dormant for some time…

#history #computer-science #history-of-technology #computers #computer-history #data-science