At the LHC, physicists need to process large amounts of data in a short time. This is typically done with rule-based algorithms designed using domain knowledge, which can reach very high accuracy at the cost of long execution times. The emergence of Deep Learning as a tool to process raw data is opening new possibilities, both in terms of accuracy and latency. In this lecture, I will review a set of examples showing how Deep Learning could be applied to typical event reconstruction problems at the LHC. Besides offering a cost-effective alternative to classic algorithms, Deep Learning provides an opportunity to explore new methods for probing the existence of physics beyond the Standard Model.