From an introduction to modeling for medical diagnosis. I am feeling sick. Fever. Cough. Stuffy nose. And it’s wintertime. Do I have the flu? Likely. Plus I have muscle pain. More likely.
Bayesian networks are great for these types of inferences. We have variables, some of whose values have been fixed. We are interested in the probabilities of some free variables given these fixed values.
In our example, we want the probability that we have the flu, given some symptoms we have observed, and the season we are in.
So far it looks like plain reasoning with conditional probabilities. Is there more to it? Yes. A lot more. Let’s scale up this example and the difference will become clear.
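Before scaling up, the small flu example can already be made concrete. Below is a minimal sketch of a toy network Season → Flu → {Fever, Cough}, with all probability values invented purely for illustration. It computes P(flu | season, fever, cough) by enumerating the joint distribution, exactly the kind of query described above.

```python
# Toy Bayesian network: Season -> Flu -> {Fever, Cough}.
# All probabilities below are invented illustrative numbers.

P_season = {"winter": 0.25, "other": 0.75}
P_flu_given_season = {"winter": 0.10, "other": 0.02}   # P(flu=1 | season)
P_fever_given_flu = {True: 0.90, False: 0.05}          # P(fever=1 | flu)
P_cough_given_flu = {True: 0.80, False: 0.10}          # P(cough=1 | flu)

def joint(season, flu, fever, cough):
    """P(season, flu, fever, cough) via the chain rule of the network."""
    p = P_season[season]
    p *= P_flu_given_season[season] if flu else 1 - P_flu_given_season[season]
    p *= P_fever_given_flu[flu] if fever else 1 - P_fever_given_flu[flu]
    p *= P_cough_given_flu[flu] if cough else 1 - P_cough_given_flu[flu]
    return p

def p_flu_given(season, fever, cough):
    """P(flu=1 | season, fever, cough) by summing out flu in the joint."""
    num = joint(season, True, fever, cough)
    den = num + joint(season, False, fever, cough)
    return num / den

print(round(p_flu_given("winter", fever=True, cough=True), 3))  # -> 0.941
```

With these made-up numbers, observing fever and cough in winter pushes the flu probability from a 10% prior to about 94%, which is the "Likely. More likely." intuition made quantitative.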
Towards A Large-scale Bayes Network
Imagine that our network models every possible symptom, every possible disease, outcomes of every possible medical test, and every possible external factor that might potentially affect the probability of some disease. External factors break down into behavioral ones (smoking, being a couch potato, eating too much), physiological ones (weight, gender, age), and others. For good measure, let’s also throw in treatments. And side-effects.
By now there is enough useful medical knowledge to capture tens of thousands of variables (at the very least) and their interactions. For any set of symptoms, together with the values of some of the behavioral, physiological, and other external factors, we could estimate the probabilities of various diseases. And more. For a given disease, we could ask it to give us the most likely symptoms. And way more. Such as: I have a cough and high fever, but the flu has been ruled out; what other diseases are likely? For a given diagnosis, our particular symptoms, and possibly additional factors such as our gender and age, we could ask it to recommend treatments.
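The "flu has been ruled out" query from this paragraph can also be sketched by enumeration. The toy model below has two independent disease variables (flu, cold) that jointly influence two symptoms; all table entries are invented for illustration. Conditioning on flu = 0 along with the symptoms answers "what other diseases are likely?"

```python
# Toy two-disease network for the "flu ruled out" query:
# P(cold = 1 | cough = 1, fever = 1, flu = 0). All numbers are invented.

P = {"flu": 0.05, "cold": 0.20}  # priors P(disease = 1)
# P(symptom = 1 | flu, cold), keyed by the (flu, cold) pair (invented tables):
P_fever = {(0, 0): 0.02, (0, 1): 0.30, (1, 0): 0.90, (1, 1): 0.92}
P_cough = {(0, 0): 0.05, (0, 1): 0.60, (1, 0): 0.80, (1, 1): 0.90}

def joint(flu, cold, fever, cough):
    """P(flu, cold, fever, cough) via the chain rule of the network."""
    p = (P["flu"] if flu else 1 - P["flu"])
    p *= (P["cold"] if cold else 1 - P["cold"])
    p *= P_fever[(flu, cold)] if fever else 1 - P_fever[(flu, cold)]
    p *= P_cough[(flu, cold)] if cough else 1 - P_cough[(flu, cough := cough) and (flu, cold) == (flu, cold) and (flu, cold)][1] if False else (P_cough[(flu, cold)] if cough else 1 - P_cough[(flu, cold)])
    return p

# Condition on the evidence: cough = 1, fever = 1, and flu = 0 (ruled out).
num = joint(0, 1, 1, 1)                    # cold = 1, consistent with evidence
den = joint(0, 0, 1, 1) + joint(0, 1, 1, 1)  # sum over cold = 0 and cold = 1
print(round(num / den, 3))
```

Ruling out the flu shifts nearly all of the posterior mass onto the remaining disease that can explain the symptoms; in a large network the same conditioning-and-summing pattern runs over thousands of variables, which is why efficient inference algorithms matter.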
Now we are getting somewhere. How does all this magic work? This is what we will explore here.