EUROPEAN WOLFRAM TECHNOLOGY CONFERENCE 2018 | 14-15 June 2018 | 9am-6pm | Oxford

Thank you to all the participants and speakers for making the 2018 European Wolfram Technology Conference a success.

Scheduled Talks

Enhancing the Wolfram Language Tom Wickham-Jones
The Wolfram Compiler is a new technology to enhance the core of the Wolfram Language. This talk will discuss the benefits that a compiler can bring and show the latest advances.
Modelling Socioeconomic Welfare Systems Niels J. Sørensen
The forecasting of financial risks for various types of financial institutions has received increased interest in recent years, owing to the large economic setbacks experienced in the western world after the financial crisis. The application of risk-scenario simulations and solvency requirements has thus become common practice. The goal of such forecasting is to provide a suitable set of tools for decision makers, allowing appropriate "management actions" to be taken in due course. This talk discusses a class of socioeconomic and insurance models useful for forecasting. In particular, a socioeconomic Markov-type model is used as a generic model for developing and applying computational methods that demonstrate the modelling of the large socioeconomic states and their development in the welfare system of a modern municipality. The aim is to present a systematic approach for simulating various trends and interventions in the socioeconomic system of such a municipality. In this overview, we discuss principles for: (i) grouping the observations of a municipality over a period of 15 years into a time series representing the jumps and sojourns of individuals between different states; and (ii) estimating the parameters of a Markov model. The particular set of socioeconomic states is chosen with a focus on the effect of the coupling between education level and chronic illness, while preserving the model's ability to capture the general features of the welfare system. Forecasts based on constant intensities tend to capture the overall trends of future years. Traditional modelling of Scandinavian life-insurance products is also mentioned.
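To make the modelling ingredients concrete, the following sketch simulates a small Markov chain and re-estimates its transition probabilities from the simulated jumps. It is a minimal Wolfram Language illustration, not the speaker's model: the three states and the transition matrix are purely hypothetical.

  m = {{0.90, 0.07, 0.03},   (* hypothetical yearly transition probabilities *)
       {0.30, 0.60, 0.10},   (* between three illustrative socioeconomic states *)
       {0.05, 0.10, 0.85}};
  proc = DiscreteMarkovProcess[1, m];
  path = RandomFunction[proc, {0, 500}]["Path"][[All, 2]];   (* simulated state sequence *)
  counts = Counts[Partition[path, 2, 1]];                    (* observed jumps *)
  est = Table[Lookup[counts, Key[{i, j}], 0], {i, 3}, {j, 3}];
  N[est/Total[est, {2}]]                                     (* row-normalised estimate of m *)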
View Presentation
Recreational Coding Jon McLoone
Much of the focus in data science tends to be on the application of traditional statistical techniques in more automated ways. This talk will discuss how this automation now makes it possible for data scientists to apply much more sophisticated computation to their data, achieving deeper insights and deriving more value from the data. Computational techniques originally developed for science and engineering applications can often be applied, in sometimes surprising ways, to big data applications. During the talk, we'll see how techniques ranging from machine learning to image processing and signal processing are now easy to apply.
View Presentation
Introducing Wolfram CloudConnector for Excel Anthony Zupnik
Wolfram CloudConnector for Excel is a product designed to bring the computational power of the Wolfram Language to Excel without the hassle of having to learn a new programming language. Computations are delivered from the Wolfram Cloud using Wolfram APIs, powerful tools for creating and distributing Wolfram functionality via web requests. Learn how simple it is to create, develop and deploy a Wolfram API with a live-coding demonstration. In this way, you can integrate unique and powerful functions with the familiar environment of an Excel spreadsheet.
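As a rough sketch of what sits behind such a deployment (the endpoint path and the fitted quantity are illustrative, not from the talk), a Wolfram Language API can be created and pushed to the Wolfram Cloud in a few lines and then called from Excel via CloudConnector or a plain web request:

  api = APIFunction[{"values" -> DelimitedSequence["Number"]},
     Last[LinearModelFit[#values, x, x]["BestFitParameters"]] &];   (* slope of a simple linear fit *)
  CloudDeploy[api, "excelDemo/slope", Permissions -> "Public"]      (* hypothetical URL path *)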
Technology Created for Computer-Based Maths Gerli Jõgeva
View Presentation
What's New in the World of Cloud Notebooks Jan Poeschko
After a general introduction to cloud notebooks, we will talk about recent developments and what lies ahead. Our focus has been on performance and stability, but we have also worked on new features such as multi-click selection, a spellchecker, more beautiful control rendering and TeX input.
View Presentation
Developing a Cloud Application Jan Poeschko
This is a demo showing how to build a little web application using notebooks in the Wolfram Cloud. We will dive into some code implementing a "static site generator" and discuss a few general things to consider when deploying applications to the cloud.
View Presentation
Sandpile Animations Roman Maeder
A simple simulation consists of a square or hexagonal array of cells, each holding a number of grains of sand. Any pile containing more than three grains (on the square array) or more than five grains (on the hexagonal array) becomes unstable and sheds one grain to each of its immediate neighbours. These cells may then themselves contain too many grains and shed grains to their neighbours, and so on. The evolution of this system eventually terminates, and the result is independent of the order in which the redistribution process is carried out.

We look at the fascinating toppling sequences that arise from various initial conditions, leading to fractal images, and discuss efficient code for generating animations and images with millions of grains of sand.
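A minimal sketch of the square-grid toppling rule (not the speaker's optimised animation code) can be written with a convolution that redistributes grains from all unstable cells in parallel; because the model is abelian, the final stable configuration does not depend on this choice of update order:

  topple[grid_] := Module[{unstable = UnitStep[grid - 4]},    (* cells with more than three grains *)
     grid - 4 unstable +
      ListConvolve[{{0, 1, 0}, {1, 0, 1}, {0, 1, 0}}, unstable, {2, 2}, 0]];

  initial = Normal[SparseArray[{31, 31} -> 2000, {61, 61}]];  (* one large pile in the centre *)
  final = FixedPoint[topple, initial];
  ArrayPlot[final, ColorFunction -> "SunsetColors"]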
View Presentation
Bayesian Inference in the Wolfram Language Sjoerd Smit
This presentation shows how both analytical and numerical functionalities in the Wolfram Language can be used to implement Bayesian data analysis methods.
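As a flavour of mixing analytical and numerical functionality (a minimal sketch, not the presenter's material), the posterior of a coin-flip model with a Beta prior can be derived symbolically and then summarised numerically:

  prior = PDF[BetaDistribution[2, 2], p];
  likelihood = p^7 (1 - p)^3;                      (* assumed data: 7 heads, 3 tails *)
  posterior = Simplify[prior likelihood/Integrate[prior likelihood, {p, 0, 1}]];
  {posterior, NIntegrate[p posterior, {p, 0, 1}]}  (* posterior density and its mean *)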
View Presentation
Solving Nonlinear Differential Equations Flavio Sartoretto
Mathematica is great at solving linear differential equations analytically. It is also worth exploiting for computing numerical solutions to nonlinear equations. We attack the reduced-gravity, shallow-water equation problem, comparing the analytical solution of the first-order problem without viscosity to numerical solutions obtained with either Mathematica or MATLAB.
View Presentation
Deep Learning in the Wolfram Language Giulio Alessandrini
The Wolfram Language contains a scalable, industrial-strength and easy-to-use neural net framework. In this talk, we are going to demonstrate its capabilities and its integration with the rest of the language. We are going to show how to build and train a net model from scratch, as well as how to work with a ready-to-use model from the Neural Net Repository. Lastly, we will cover some of our short- to mid-term plans.
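For orientation, here is a minimal sketch of both workflows: a tiny classifier built and trained from scratch on synthetic data, and a pre-trained network fetched from the Wolfram Neural Net Repository. The layer sizes, training data and class names are illustrative, not taken from the talk.

  net = NetChain[{LinearLayer[16], Ramp, LinearLayer[2], SoftmaxLayer[]},
     "Input" -> 2, "Output" -> NetDecoder[{"Class", {"inside", "outside"}}]];
  data = Table[With[{pt = RandomReal[{-1, 1}, 2]},
      pt -> If[Norm[pt] < 0.7, "inside", "outside"]], 2000];   (* synthetic labelled points *)
  trained = NetTrain[net, data, MaxTrainingRounds -> 20];
  trained[{0.1, 0.2}]

  lenet = NetModel["LeNet Trained on MNIST Data"]              (* ready-to-use repository model *)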
View Presentation
Neural Networks Solutions for Image and Audio Processing Giulio Alessandrini
Image processing and, more recently, audio processing have been drastically affected by new neural network–based pipelines. Tasks like unsupervised classification, feature extraction, data imputation, object detection and image recognition have all improved in accuracy, speed or both. With practical examples and a focus on real-world applications, this talk will show how to use the Wolfram Language deep learning framework to perform advanced data processing.
View Presentation
Conversational Agents for Generation of Machine Learning Workflows Anton Antonov
In this talk, we are going to discuss the design principles, structure and implementation of several different conversational agents that produce Wolfram Language code for machine learning workflows. The conversational agents are based on: (1) finite state machines; (2) natural language context-free grammars; and (3) monadic programming domain-specific languages. We are going to look into conversational agents generating workflows for classification of data records, time series analysis, latent semantic analysis of text, and database browsing and selection.
View Presentation
Patient Critical Conditions Prediction Frameworks: A Comparison of Two Approaches Anton Antonov
In this presentation, we consider two software frameworks for predicting critical patient conditions, designed and implemented in two principally different ways. The first is designed using object-oriented programming design patterns and implemented in R. The second is designed using monadic programming and implemented in the Wolfram Language. Beyond design and implementation, the frameworks are compared with respect to their target end users and extension developers. Each framework is endowed with a different interface: the first uses an interactive dashboard design, while the second uses a domain-specific language.
View Presentation
The Wolfram Language and IoT Devices Mark Braithwaite
A tour of the Wolfram Language's capabilities with IoT devices. This example-based session focuses on the Raspberry Pi, showcasing how the Wolfram Language's integration and machine learning capabilities allow you to make a small but elegant gadget.
Analysing Google Search Results Prior to the German Election in 2017 Michael Gamer
Recently I have been working on a research project in cooperation with the University of Kaiserslautern, Germany, regarding Google search results during the German election of September 2017 (known in Germany as the "Datenspende" project). More than 4,000 people joined this project in the weeks before the election. They installed a plug-in in their browsers, and every four hours this plug-in fired searches regarding the main players in the election (the parties CDU and SPD, and the politicians Angela Merkel and Martin Schulz), recording the search results Google produced; hence "Datenspende," which means "donation of data." We are currently writing the final report on whether the search results show personalization or regionalization. In this project, I use Mathematica for the data analysis: building up the datasets (we have approximately six million results to process) and analysing the data in various ways.
Mathematical Models and Point Clouds in Architecture Corrado Falcolini

Datasets coming from modern surveys consist essentially of a very long list of point coordinates (a "point cloud") detected on the surface of the object under study. Each point can be associated with some local feature: color (photogrammetric survey), reflectance (an index of material reflection from a laser scanner survey) or orientation (the average normal of the closest faces of the triangulated surface through the points), but the point coordinates alone are often enough to build a 3D geometrical model of the object and evaluate its features.

Given such a discrete point cloud representation of a surveyed object, it is then possible to analyze its three-dimensional features with the help of several mathematical models: parametric surfaces and curves, planar sections, segmentation or matching procedures and tessellation.

We present a few applications of original algorithms, written in Mathematica, to specific cases of interest in architecture and cultural heritage: the detection of an optimal fluting-column model in the virtual reconstruction of the Arch of Titus at the Circus Maximus in Rome; the contact probability of two possibly adjacent fragments in the recomposition and restoration of the fragmented S. Andrea statue at Stiffe, L'Aquila; the shape and tessellation of Borromini's San Carlino dome; and the detection and analysis of bricks in the masonry of the Villa dei Misteri in Pompeii.

The Inclusion of the Arts in Science with the Help of New Technologies Paulina Toimil Davila
For most people, understanding scientific knowledge is a very difficult or off-putting activity; the sciences are perceived as a type of knowledge that only "geniuses" can access. This misconception stems from educational shortcomings, because not all educators have the sensitivity to transmit scientific knowledge. To solve this problem, we need a new way of teaching science and new ways of transmitting scientific knowledge. Now we can do so through human sensitivity, with gratifying sensory experiences such as art. On the surface, science and art have nothing to do with each other; however, both converge on attributes that are essential. With new technologies and programming languages, we can make this correlation evident; an example would be the Wolfram Language, where we can "give color to a number" or listen to the sound of a quantum physics equation... The importance of including art in the sciences rests on these innovative ways of transmitting and teaching scientific knowledge, and on programming applied to mathematical models to generate a new type of science education that breaks paradigms.
View Presentation
Interactive Digital Document for Intermediate Microeconomics Courses Loreto Llorente, Javier Puértolas
This presentation shows an interactive digital document in CDF format, built entirely with Mathematica. It is an intermediate microeconomics course in which everything is integrated into a single document with full interactivity.

The whole document is interactive. Several controls enable the reader to interact with most graphics. Definitions are available on request, and throughout the text you can decide whether to use calculus or not. There is a predefined order of concepts and activities to be followed by the reader, but you can also construct your own path using the contents, the index and the links within the text.
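The kind of interactivity described above is typically built with Manipulate. The following is an illustrative sketch rather than an excerpt from the course document: a supply and demand diagram whose equilibrium responds to the slider values.

  Manipulate[
   Plot[{a - b q, c + d q}, {q, 0, 10},
    PlotLegends -> {"demand", "supply"},
    AxesLabel -> {"quantity", "price"}, PlotRange -> {0, 14}],
   {{a, 10, "demand intercept"}, 6, 14}, {{b, 1, "demand slope"}, 0.2, 2},
   {{c, 1, "supply intercept"}, 0, 5}, {{d, 1, "supply slope"}, 0.2, 2}]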
View Presentation
Entropy Measures in Compositions of Sync/Async Boolean Networks Tommaso Bolognesi

Boolean networks ("Bool nets") are sequential, dynamical systems introduced by S. Kauffman for modelling biological genetic systems. They are a generalisation of cellular automata.

In this talk, we illustrate two alternative ways—shared-state variables and shared labeled transitions—in which two Bool nets P and Q can be composed to form a new system with a user-controlled degree of coupling.

Then we report on a number of experiments we have carried out with Mathematica, whose purposes are:
(i) to specify and simulate synchronous and asynchronous, simple or composite Bool nets;
(ii) to collect statistical data on informational measures such as (conditional) entropy and (conditional) mutual information, applied to computations of the considered model; and
(iii) to relate the collected statistical informational values to the degree of coupling of the compositions and to the computation lengths.
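As a minimal illustration of items (i) and (ii) above (a sketch only, not the authors' composition framework), a random synchronous Boolean network can be specified and simulated in a few lines, with an entropy measure applied to the visited states:

  n = 8; k = 2;                                            (* nodes and inputs per node *)
  inputs = Table[RandomSample[Range[n], k], {n}];          (* wiring of each node *)
  rules = Table[RandomInteger[1, 2^k], {n}];               (* random truth table per node *)
  step[state_] := Table[rules[[i, 1 + FromDigits[state[[inputs[[i]]]], 2]]], {i, n}];
  trajectory = NestList[step, RandomInteger[1, n], 100];   (* synchronous updates *)
  Entropy[2, trajectory]                                   (* base-2 entropy of the visited states *)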
View Presentation
Insights from Teaching Wolfram Language Programming Maik Meusel
This presentation aims to share insights gained from teaching a Wolfram Language programming boot camp for master's students at the University of Zurich.
Using a blended learning approach, students practiced coding at home, and questions were discussed during class sessions. To pass the course, eight individual coding assignments and a final exam had to be completed within three weeks.
To handle a large class of more than 60 students, we used a Wolfram Language–based LMS for creating and distributing course materials and for automated grading of all assessments.
In addition to sharing experience from teaching this course, we will analyze students' feedback and data.
Using Wolfram|Alpha and Mathematica to Enhance Teaching, Learning, Assessment, Research and Employability Stephen Lynch
In the UK, mathematics is the most popular subject at A-Level; however, it is not a popular subject to study at the university level. Can the use of mathematics packages make it a more popular subject to study at A-Level and at university? At MMU, we wanted to attract and retain mathematics students and prepare them for careers upon graduation. By integrating maths/stats packages across the curriculum and by solving real-world problems, we have managed to make the course highly desirable and loved by our students.
Automated Human-Like Reasoning with Theorema 2.0 Wolfgang Windsteiger

Theorema is a system aimed at supporting mathematical theory exploration. The core activities in theory exploration are defining new mathematical concepts and proving their properties. The main focus of Theorema is on a human-like style in both input and output. This means that mathematical input should appear as though written in a good math textbook, and proofs generated by the system should read as though written by a smart student.

Theorema uses programmable expression syntax in Mathematica to define its own version of a mathematical language based on sets, tuples and common constructs of predicate logic. Customized cascading stylesheets allow one to compose entire theories in notebooks with enhanced functionality and appealing style. The reasoning engine is based on a deduction calculus in natural style making heavy use of syntactic and semantic pattern matching as available in Mathematica for decades. Theorema is developed as a standard Mathematica add-on package and is available free of charge under a GNU GPL license.

The applications of the Theorema system range from higher-level education to research mathematics. The more computational thinking brings algorithms into the classroom, the more the correctness of algorithms will become a key topic for future generations of students. Formal, trusted proving will be the enabling technique in this area, and the significance of formal reasoning will also increase for pure mathematics. The Theorema system has recently been used for a formalization of second-price auctions in theoretical economics.

On the Package ReactionKinetics.wl János Tóth, Attila László Nagy, Dávid Papp
When investigating the deterministic and/or stochastic models of chemical reactions, a large number of calculations have to be carried out. The preliminary steps are creating and studying different graphs describing reactions, which involves the use of combinatorics and linear algebra. One is also interested in whether mass is conserved in a model; this can be decided using the methods of linear programming. Stationary points and stationary distributions are to be determined, which means solving large polynomial equations. One has to solve (quite often stiff) ordinary differential equations and simulate Markovian jump processes. Parameters of these processes are to be estimated from measurements, even in cases when not all the concentration-time curves are known; this problem is implicit and highly nonlinear. Our package helps with all these tasks arising in chemistry (atmospheric chemistry), biochemistry (modelling metabolism) and chemical engineering (combustion), but the models of chemical reaction kinetics are used outside chemistry as well. We show the problems and solution methods from the programming point of view, along with a series of applications. A comparison with other programs will also be presented.
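For readers unfamiliar with the deterministic side of such models, the sketch below (plain Wolfram Language, not the ReactionKinetics.wl API) writes down and integrates the mass-action kinetics of a hypothetical reversible reaction A + B <-> C:

  k1 = 1.0; k2 = 0.3;                                  (* assumed rate coefficients *)
  sol = NDSolveValue[{
      a'[t] == -k1 a[t] b[t] + k2 c[t],
      b'[t] == -k1 a[t] b[t] + k2 c[t],
      c'[t] == k1 a[t] b[t] - k2 c[t],
      a[0] == 1, b[0] == 0.8, c[0] == 0},
     {a, b, c}, {t, 0, 20}];
  Plot[{sol[[1]][t], sol[[2]][t], sol[[3]][t]}, {t, 0, 20},
   PlotLegends -> {"A", "B", "C"}]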
Enterprise Behavioural Analytics: Applied Machine Learning Stephen Christie

Financial and economic systems require a model of agent behaviour. In many cases, these models are known, and have been systematically shown, to be inherently naive but useful, for example because of their tractability or transparency. As more applications require increased use of complex automated decisioning and as data complexity increases, many of these simplifications become unacceptable. Behavioural science applications are becoming more widely adopted in fields such as surveillance and intelligent decisioning. This is an important application area in which enterprise behavioural analytics (EBA), combined with enterprise dynamics, enables deeper applied use cases to be implemented. This is also a multi-industry opportunity.

EBA has recently been integrated into Nasdaq's trade surveillance armoury, and is in the process of being applied to AI applications in medtech and agritech.

This presentation outlines the logic associated with the development of EBA and poses future use cases and areas of research.

Connecting the Wolfram Language to the IoT World: Methods, Example Code and Benchmarks Pedro Fonseca
Different communication and messaging protocols will be explored: MQTT, ZMQ, HTTP, CoAP, PHP (POST and GET), direct database link (e.g. MySQL), UDP, etc. Example code will be presented for continuous data transmission/feed ("real time"), on-demand data or action request, local storage followed by batch transmission, etc., between the Wolfram Language (Mathematica as server or client), microcontrollers (e.g. ESP32 as server or client), wiring technologies (e.g. Node-RED), data accumulators (e.g. Wolfram Data Drop) and particular programming environments (e.g. MicroPython). Benchmarks will be presented for the different examples.
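As a flavour of two of these transport options (a minimal sketch with an illustrative device address and a hypothetical databin ID, not code from the talk), an on-demand HTTP request to a microcontroller followed by a push to Wolfram Data Drop might look as follows:

  response = URLRead[HTTPRequest["http://192.168.1.50/api/temperature",  (* illustrative device endpoint *)
       <|"Method" -> "GET"|>]];
  reading = Interpreter["Number"][response["Body"]];
  bin = Databin["exampleBinID"];                           (* hypothetical Wolfram Data Drop bin *)
  DatabinAdd[bin, <|"temperature" -> reading, "time" -> Now|>]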
Rapid Prototyping of Industrial Applications with the Wolfram Language Stefan Janecek

The Wolfram Language is the perfect tool for rapidly building prototypes for industrial applications: its symbolic nature allows one to quickly capture almost any problem, concept or new idea in a succinct, unified way, making it available for computation. The wide range of functions and built-in knowledge allows developers to concentrate on the goals they want to achieve, without wasting time on technical details. Mathematica's notebook interface and visualization capabilities foster iterative development and short feedback loops.

In this talk, we will show real-world examples of industrial application prototypes we developed during the past few years in very different areas, such as electrical engineering (designing transformers), material science (density functional theory) and computational finance. Our focus is on highlighting the path from the idea to the solution, with surprisingly few lines of code.

View Presentation
Interdisciplinary Teaching and Learning with Wolfram Mathematica Roberto Cavaliere

After 30 years of development at the frontier of research, Wolfram's technology now offers a wide range of functionalities, many of which suggest original and novel educational materials. Mathematical knowledge is just the starting point. Adding interactive interfaces, powerful visualisation features and databases of rich information from almost any known discipline provides unique potential for attractive lessons at any educational level. Considering that all these features can be used on a computer, a tablet or a mobile phone, and can run standalone or in the cloud, there is no excuse to continue without them.

This talk will show examples of how to approach different subjects using some of the most powerful features of the Wolfram Language at different levels of education. Kids can improve their scientific intuition by working with interactive demos of easy concepts and/or small games based, for example, on simple text analysis or even on geometric manipulations. Middle-school students can start working with code or even use somewhat more complex functionality, like data exploration in plain English. In high schools, teachers can use the Wolfram Language to deal with mathematics, science, biology, chemistry and even geography or literature. Finally, at the university level, there is nothing to add: any subject can benefit from all the functionalities available in Mathematica, Wolfram|Alpha, SystemModeler or any other Wolfram technology.

View Presentation
Simulation Apps with the Wolfram Language Simone Ferrero
Nova Analysis is a consulting company in the field of numerical simulation for engineering systems. In this talk, a few examples of applications developed using the Wolfram Language to support simulation are presented. In order to improve simulation preparation performance, it is useful to collect only the required data, avoiding repetitive and unnecessary tasks. Applications in Mathematica permit switching between different simulation sectors and reusing efficient functions, both native and user-defined. So, starting from simple tools for experimental data visualisation and 1D equations, Modelica objects and control algorithms, apps to support and drive analyses will be described. Moreover, built-in Mathematica functions permit running rapid, efficient and effective simulations such as 2D FEM analyses and 3D modal analyses. Some examples will be discussed.
On One Approach to Numerical Solutions of Nonlinear PDEs Exhibiting Soft Bifurcations Alexei Boulbitch
This presentation reports a numerical pseudo-dynamic approach to solving nonlinear stationary equations exhibiting bifurcations by passing from the stationary partial differential equation to a pseudo-time-dependent one. We construct the latter in such a way that the desired nontrivial solution of the stationary equation represents its fixed point. The sought numeric solution of the stationary equation is then obtained as the solution of the pseudo-time-dependent equation at a high enough value of the pseudo-time.
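A minimal sketch of the general idea, using a model equation that is not from the talk (u''[x] + 20 u[x] - u[x]^3 == 0 on [0, 1] with zero boundary values, which admits a nontrivial stationary solution): the stationary problem is embedded into a pseudo-time-dependent equation and integrated to a large enough pseudo-time, by which the solution has settled onto the desired fixed point.

  T = 10;                                                      (* "large enough" pseudo-time *)
  sol = NDSolveValue[{
      D[u[t, x], t] == D[u[t, x], x, x] + 20 u[t, x] - u[t, x]^3,
      u[0, x] == 0.1 Sin[Pi x],                                (* small nontrivial seed *)
      u[t, 0] == 0, u[t, 1] == 0},
     u, {t, 0, T}, {x, 0, 1}];
  Plot[sol[T, x], {x, 0, 1}]                                   (* approximate stationary solution *)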
View Presentation
Cosmic Rays Simulations for Testing Particle Detectors at CERN Nicola Zurlo

Secondary cosmic rays (often referred to just as "cosmic rays") constitute a continuous flux of particles produced at the top of the atmosphere, and mainly consist of muons reaching every point on the Earth's surface with quite consistent statistical properties.

These freely available and abundant particles are extremely valuable for testing and calibrating charged-particle detectors (even the largest ones at CERN), in combination with a Monte Carlo simulation of the apparatus (usually in C++/Geant4) that includes a realistic cosmic ray joint distribution of theta/phi/energy taken from the literature.

The straightforward way to perform these simulations is to randomly generate such cosmic rays on a horizontal plane above the detectors and follow their way through them. But since many rays arrive at an angle, it is essential to generate them on a surface much larger than the apparatus area, which means most of the tracked rays are "wasted."

To overcome this issue, we've developed a new approach: we've implemented a model where the rays are generated on a half-sphere enclosing the apparatus, with a convenient distribution of "source points" and a suitable angular/energy distribution, so that for every point inside the half-sphere all the cosmic rays' statistical properties hold.

The conversion from the usual "planar" three-variate PDF to the "half-spherical" five-variate PDF needs tricky analytical calculations that have been performed with Mathematica. The new distribution was eventually implemented in the C++ code to check for consistency with the usual planar-source results.
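As a small, tangential illustration of handling such angular distributions in Mathematica (not the talk's five-variate distribution), the commonly quoted cos^2(theta) zenith-angle intensity of sea-level muons can be defined as a distribution and sampled:

  dist = ProbabilityDistribution[3 Cos[theta]^2 Sin[theta], {theta, 0, Pi/2}];  (* normalised over the upper hemisphere *)
  angles = RandomVariate[dist, 10^4];
  Histogram[angles, 30, "PDF"]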

The "half-sphere" method allows a tenfold reduction in the Monte Carlo computational time with no detriment to accuracy.

View Presentation
PlateMod—Software Tools for Premium-Quality Heavy Plate Production Built with Wolfram Technologies Erik Parteder, Michael Liebrecht

The thermomechanical processing of heavy steel plates starts with the reheating of the slabs, followed by hot rolling and accelerated cooling, and finishes with levelling. In recent years, voestalpine Grobblech in Linz has made great efforts to model the entire production chain with physically based models. Together with prediction tools for the mechanical properties, advanced process automation strategies are developed and applied. This strategy uses dynamic adaptation of target values and is named "PlateMod control." With this method, the standard deviation of mechanical properties (e.g. tensile strength) of a series with a large number of plates can be restricted to very low values. For the development of all model components, Wolfram Mathematica has been used, including C++ components for solving the heat transfer equation. PlateMod control was integrated into the automation system of voestalpine Grobblech and has been online for approximately two years.

The distribution of measured mechanical properties of a large plate series shows a smaller standard deviation, and also a smaller difference between its lower and upper tail compared to standard production.

View Presentation
How Symbolic Summation Is Used in Particle Physics Carsten Schneider

In many research areas, counting problems arise that lead to complicated multi-sums. This is where symbolic summation, a subfield of computer algebra, enters the game: it helps with the task of simplifying sum expressions so that they are easy to handle in further calculations. In the last 20 years, advanced difference ring algorithms have been developed and implemented within the Mathematica package Sigma, which assists in this task. As it turns out, this toolbox is not only applicable to enumerative problems coming from combinatorics, but can also be applied to tackle sums that arise in particle physics. Within an intense and close cooperation with the Theory Group of DESY Zeuthen (Johannes Blümlein), complicated three-loop Feynman diagrams with masses are considered that describe certain behaviours of particles. These diagrams, and the corresponding integrals, can be transformed into huge expressions up to several gigabytes in size, involving millions of multi-sums. In the last few years, Sigma has been tuned and generalized into a very robust and efficient package and has been supplemented by many further Mathematica packages to perform such gigantic tasks. In this talk, the underlying algorithmic ideas and challenges coming from the field of particle physics are illustrated with concrete examples.
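To give a feel for what simplifying sums into closed forms means, here are two small identities produced automatically by the built-in Wolfram Language Sum (shown instead of the Sigma package, whose syntax is not reproduced here):

  Sum[Binomial[n, k]^2, {k, 0, n}]      (* returns Binomial[2 n, n] *)
  Sum[k (k + 1)/2, {k, 1, n}]           (* returns n (n + 1) (n + 2)/6 *)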

View Presentation
The Native Wolfram Player for iOS Jason Harris

In this talk, I will demonstrate some of the capabilities of the native Wolfram Player for iOS, which has recently been updated to 11.3, and talk about the improvements in this version. I will demonstrate how to use the Wolfram Player for iOS application to view and interact with CDF files on your iPad and iPhone, whether they come from your Wolfram Cloud account, websites, email attachments or other cloud storage. I'll demonstrate how to build packages for deployment on this platform, and I will briefly talk about the technology behind the iOS Player and future directions.

View Presentation
SystemModeler and the Wolfram Language Jan Brugård

SystemModeler and the Wolfram Language are used for model-based design in industry and education worldwide. With SystemModeler 5.1, the connection between the two tools was strengthened substantially, making much of the system modelling functionality directly available in the Wolfram Language. This talk will illustrate this with use cases from industry and education.

View Presentation
Generative Design and Programming in Architecture with Mathematica Philippe Morel

We architects all know that our age is the age of information or, as I have stated in various contexts, the age of computation. Indeed, it is not only information that has taken command but also computation: computational power, resources and abstract mathematical models. These mathematical models, extremely diverse and ever more numerous, are rarely fully accessible within the software packages usually used for architectural design. Indeed, the latter do not make it possible to set up truly open generative design approaches that designers can control at will and hybridize with one another, or with other approaches such as machine learning. In this field, the versatility of Mathematica (the Wolfram Language) as a genuine environment for technical computing and multiparadigm programming gives it considerable advantage and power. We will see, through concrete examples, how Mathematica can be used to push the boundaries of architectural design toward increasingly rich generative design approaches.

View Presentation
Educational Robotics with the Wolfram Language Yves Papegay

Poppy Ergo Jr is a small serial manipulator designed for, and widely used in, education. It comes with a multilevel control interface and a Python library. In this presentation, we will demonstrate how we used the Wolfram Language to control this robot and to build several layers of pedagogic activities. Through these activities, fundamentals of mathematics and algorithmics are introduced, namely geometrical modelling and open- and closed-loop control. We will also discuss motion planning and task planning, as well as image manipulation and sensor data analysis.

View Presentation

Download image gallery