Deep Learning Meets Physics

Deep Learning | 07 July 2019

I’m always fascinated by the matter around us, which makes my neurons wonder: what’s inside matter? How did matter form in the first place? Who or what created the matter that exists in our universe now? Why isn’t there as much anti-matter as there is matter in this universe? These fundamental questions are still being researched by some of the greatest minds of our time across the world.

Space and time are modes by which we think, not conditions under which we live - Albert Einstein

CERN

When you search the internet for answers to fundamental questions like these, you will find CERN in many of the links that Google provides. If you love science, then you probably know that the composition of our universe is estimated to be about 73% dark energy, 23% dark matter, 3.6% intergalactic gas and the rest, the visible matter that we see around us.

Figure 1. Predicted Composition of our Universe [source]

Scientists at CERN are researching this composition of our universe and still do not understand what constitutes dark matter or dark energy. Before we talk about how deep learning fits in here, let’s see some swashbuckling facts about CERN.

  • An international organization straddling the Swiss-French border, founded in 1954, with the world’s largest facilities for fundamental research in particle physics.
  • 23 member states and a budget of about 1.1 billion CHF (~1.1 billion USD).
  • Around 3,000 members of personnel plus 15,000 associated members from 90 countries.
  • Home to the Large Hadron Collider (LHC), the largest machine in the world, a ring 27 km in circumference.
  • The fastest racetrack on Earth, where protons travel at 99.9999991% of the speed of light.
  • The emptiest place in the solar system, where particles circulate in an ultra-high vacuum.
  • The hottest spot in the galaxy, where lead-ion collisions create temperatures 100,000x hotter than the heart of the sun.
  • In 1989, Tim Berners-Lee proposed the creation of a distributed information system, which evolved into what we call the World Wide Web.
  • The world’s first web page was served from CERN.

There are four major experiments at the Large Hadron Collider (LHC) at CERN, as shown in Figure 2. These are detectors with on the order of 100 million sensors that gather information about particles’ trajectories, electrical charge and energy.

  • ATLAS - A Toroidal LHC Apparatus
  • CMS - Compact Muon Solenoid
  • ALICE - A Large Ion Collider Experiment
  • LHCb - LHC-beauty

The other three experiments at the LHC are

  • TOTEM - Total Cross Section, Elastic Scattering and Diffraction Dissociation
  • LHCf - LHC-forward
  • MoEDAL - Monopole and Exotics Detector At the LHC

Kindly visit this page to learn more about the LHC.

Figure 2. Experiments at CERN - CMS, ALICE, ATLAS, LHCb

OK, why do we need such a big experimental setup to explore what’s inside an atom? Let’s first understand why we need to figure out what’s inside an atom in the first place.

It is believed that everything we see around us (the sun, earth, moon, stars, trees, humans, etc.) began with the Big Bang. Before learning about the fundamental particles and forces of nature, please read the Chronology of the Universe to understand the history, present and future of our universe. You will then see why what we need to find out is the most fundamental particles that make up matter and how the forces of nature interact with them.

Figure 3. Diagram of the evolution of the (observable part of the) universe from the Big Bang (left) to the present [source]

The picture above tells us that the Big Bang happened at a specific point in time, converting energy into matter (made of particles). Using the LHC, if we collide proton beams travelling at nearly the speed of light, we can discover what’s inside the proton. The study of such sub-atomic particles is called particle physics, which allows humans to seek answers to science’s most fundamental questions.

The Standard Model

The matter we see around us is found to be made of a few basic building blocks governed by four fundamental forces of nature. You can read more about the Standard Model here.

Figure 4. Standard Model of Elementary Particles

It turns out that, in the search for these sub-atomic particles, the LHC experiments generate on the order of a petabyte of collision data per second, which makes them an ideal place to apply deep learning algorithms. For example, the CMS (Compact Muon Solenoid) experiment sees about 40 million collisions per second (roughly PB/s of raw data), which the trigger system filters in real time to about 100,000 selections per second (TB/s) and then to about 1,000 selections per second (GB/s), keeping only potentially interesting events. A conceptual sketch of such a cascade follows.
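As a purely conceptual sketch of this kind of cascade (the real triggers run in custom hardware and optimized C++; the "energy" values and thresholds below are made up for illustration), a two-stage filter over a stream of events might look like this:

```python
import random

def event_stream(n):
    """Yield n toy collision events with a random 'energy' value (illustrative units)."""
    for _ in range(n):
        yield {"energy": random.expovariate(1 / 50)}

def level1_trigger(events, threshold=200):
    """First, coarse selection: keep only events above a loose energy cut."""
    return (e for e in events if e["energy"] > threshold)

def high_level_trigger(events, threshold=400):
    """Second, tighter selection applied to the survivors of the first stage."""
    return (e for e in events if e["energy"] > threshold)

selected = list(high_level_trigger(level1_trigger(event_stream(1_000_000))))
print(f"kept {len(selected)} of 1,000,000 toy events")
```

Each stage throws away the vast majority of events, which is why the rate drops from millions of collisions per second to only thousands of stored selections per second.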

The complexity of these experiments is so huge that, if you are a data lover, this seems like the place for you to explore, analyze, visualize and find meaning in data. Take a look at the numbers for CERN’s data centre shown below.

Figure 5. CERN Data Center in Numbers

Research Papers

Some of the research work that I follow and read on the application of deep learning algorithms to high-energy physics is collected below.

A jet multiclass classifier

A jet is a narrow cone of hadrons and other particles produced by the hadronization of a quark or gluon. Simple deep neural networks trained on high-level features (jet masses, multiplicities, energy correlation functions, etc.) can be used to build a jet multiclass classifier, as sketched below.
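As a minimal sketch of this idea (the feature count and the five hypothetical jet classes below are assumptions for illustration, not the setup of any specific paper), a small fully-connected network in Keras might look like this:

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Assumed setup: 16 high-level jet features (e.g. mass, multiplicity,
# energy-correlation-function ratios) and 5 hypothetical jet classes
# (e.g. quark, gluon, W, Z, top).
n_features, n_classes = 16, 5

model = keras.Sequential([
    layers.Input(shape=(n_features,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(64, activation="relu"),
    layers.Dense(n_classes, activation="softmax"),  # one probability per jet class
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Toy random data, just to show the expected shapes.
X = np.random.rand(1000, n_features).astype("float32")
y = np.random.randint(0, n_classes, size=1000)
model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2)
```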

It’s very interesting to know that deep learning algorithms such as Convolutional Neural Networks (CNN), Recurrent Neural Networks (RNN), bi-directional RNNs, Long Short-Term Memory (LSTM) networks, Gated Recurrent Units (GRU) and Generative Adversarial Networks (GAN) are used in LHC experiments to analyze data and gain insights about the nature of our universe.


ML Challenge

On 4 July 2012, the CMS and ATLAS experiments at the LHC announced the discovery of the Higgs boson. How a particle decays into other particles is key to understanding and measuring its characteristics. It has been confirmed that the Higgs boson can decay into two tau particles, a small signal buried in background noise.

Recently, I became aware of the Higgs Boson Machine Learning Challenge conducted by CERN and Kaggle in 2014. Here is the link to the challenge: Higgs boson machine-learning challenge.

If you know machine learning, that’s enough to participate in this competition: the task is to classify ATLAS events as either the tau-tau decay of a Higgs boson (signal) or background, using the features characterizing each event. A rough baseline is sketched below.
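As a rough starting point (assuming the challenge’s training.csv with EventId, Weight and Label columns and -999.0 as the missing-value marker, and ignoring the event weights and the official AMS metric for simplicity), a baseline classifier might look like this:

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Assumed layout of the Kaggle training file: an EventId column, a Weight
# column, a Label column with 's' (signal) / 'b' (background), and the
# physics features in between; -999.0 marks missing values in this dataset.
df = pd.read_csv("training.csv")
X = df.drop(columns=["EventId", "Weight", "Label"]).replace(-999.0, 0.0)
y = (df["Label"] == "s").astype(int)  # 1 = Higgs -> tau tau signal, 0 = background

X_train, X_val, y_train, y_val = train_test_split(X, y, test_size=0.2, random_state=0)

clf = GradientBoostingClassifier(n_estimators=200, max_depth=3)
clf.fit(X_train, y_train)
print("validation AUC:", roc_auc_score(y_val, clf.predict_proba(X_val)[:, 1]))
```

The real competition was scored with the approximate median significance (AMS), so a serious entry would optimize the decision threshold for that metric rather than plain accuracy or AUC.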

For a beginner like me who is interested in applying ML/DL to high-energy physics, I found this to be the perfect start!


Tools used for DL

It’s amazing to hear that Python and its ecosystem are used for data analytics at the CERN LHC. Just as the first image of a black hole was created using Python and its ecosystem, we could use the same tools to understand more about our universe.

Some of the tools used at CERN for data analytics are listed below; a tiny example of a few of them working together follows the list.

  • Jupyter Notebook
  • Apache Spark
  • Apache Kafka
  • Analytics Zoo
  • BigDL
  • HDFS
  • TensorFlow
  • Keras
  • NumPy
  • SciPy
  • Pandas
  • Matplotlib
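
As a trivial illustration of how a few of these pieces fit together (the file name and column name below are hypothetical), one might explore event data in a Jupyter Notebook like this:

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical CSV export of reconstructed events with an invariant-mass column.
events = pd.read_csv("events.csv")
print(events.describe())

# Histogram the invariant mass to look for resonance peaks.
events["invariant_mass"].plot.hist(bins=100)
plt.xlabel("Invariant mass (GeV)")
plt.ylabel("Events")
plt.show()
```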


In case you find something useful to add to this article, spot a bug in the code, or would like to improve some of the points mentioned, feel free to write it down in the comments. Hope you found something useful here.