Thank you to all the participants and speakers for making the 2018 European Wolfram Technology Conference a success.
Datasets coming from modern surveys essentially consist of a very long list of coordinates of points (a "point cloud") detected on the surface of the object under study. Each point can be associated with some local feature: color (photogrammetric survey), reflectance (an index of material reflection from a laser scanner survey) or orientation (the average normal to the closest faces of the surface triangulated through the points), but the point coordinates alone are often enough to evaluate the object's 3D geometric model and features.
Given such a discrete point cloud representation of a surveyed object, it is then possible to analyze its three-dimensional features with the help of several mathematical models: parametric surfaces and curves, planar sections, segmentation or matching procedures and tessellation.
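As a small illustration of the orientation feature mentioned above, here is a hedged Python sketch (not the authors' Mathematica code) that estimates a local surface normal at a point of a synthetic point cloud via principal component analysis, a standard alternative to averaging the normals of adjacent triangulated faces; all names and parameters are illustrative.

```python
import numpy as np

def estimate_normal(points, query, k=12):
    """Estimate the local surface normal at `query` from its k nearest
    neighbours in a point cloud: the normal is the eigenvector of the
    neighbourhood covariance matrix with the smallest eigenvalue."""
    points = np.asarray(points, dtype=float)
    # k nearest neighbours of the query point (brute force, for clarity)
    dists = np.linalg.norm(points - query, axis=1)
    neighbours = points[np.argsort(dists)[:k]]
    centred = neighbours - neighbours.mean(axis=0)
    cov = centred.T @ centred
    eigvals, eigvecs = np.linalg.eigh(cov)  # eigenvalues in ascending order
    return eigvecs[:, 0]  # direction of least variance ~ surface normal

# Noisy samples of the plane z = 0: the estimated normal should be ~ +/-(0, 0, 1)
rng = np.random.default_rng(0)
cloud = np.column_stack([rng.uniform(-1, 1, 200),
                         rng.uniform(-1, 1, 200),
                         rng.normal(0, 0.01, 200)])
n = estimate_normal(cloud, np.array([0.0, 0.0, 0.0]))
print(n)  # z component close to +/-1
```

The same least-variance idea underlies plane fitting for planar sections and segmentation of point clouds into near-planar patches.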
We present a few applications of original algorithms, written in Mathematica, to specific cases of interest in architecture and cultural heritage: the detection of an optimal fluting model for the columns in the virtual reconstruction of the Arch of Titus at the Circus Maximus in Rome; the contact probability of two possibly adjacent fragments in the recomposition and restoration of the fragmented S. Andrea statue at Stiffe, L'Aquila; the shape and tessellation of Borromini's San Carlino dome; and the detection of bricks and analysis of the masonry type of the Villa dei Misteri in Pompeii.
Boolean networks ("Bool nets") are sequential, dynamical systems introduced by S. Kauffman for modelling biological genetic systems. They are a generalisation of cellular automata.
In this talk, we illustrate two alternative ways—shared-state variables and shared labeled transitions—in which two Bool nets P and Q can be composed to form a new system with a user-controlled degree of coupling.
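As a toy illustration of the shared-state variant only (the networks, variable names and update rules below are hypothetical, not the speakers' examples), two small Bool nets can be coupled by letting one network read a state variable owned by the other:

```python
# Hypothetical sketch: two tiny Boolean networks composed through a
# shared-state variable `s`. Each network is a dict mapping a variable
# to its update function, evaluated synchronously over the joint state.

def step(network, state):
    """One synchronous update: every variable recomputes from the old state."""
    return {var: update(state) for var, update in network.items()}

# Network P: p becomes NOT s; s copies p.  Network Q: q copies the shared s.
P = {"p": lambda st: not st["s"],
     "s": lambda st: st["p"]}
Q = {"q": lambda st: st["s"]}

# Shared-state composition: the union of the update rules over a joint
# state. Variable `s` belongs to P but is read by Q, coupling the two;
# removing `s` from Q's rule would decouple them entirely.
PQ = {**P, **Q}

state = {"p": False, "s": False, "q": False}
trace = [state]
for _ in range(4):
    state = step(PQ, state)
    trace.append(state)
print(trace)
```

The degree of coupling is then controlled by how many variables (or, in the alternative formulation, labeled transitions) the two components share.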
Theorema is a system aimed at supporting mathematical theory exploration. The core activities in theory exploration are defining new mathematical concepts and proving their properties. The main focus of Theorema is on a human-like style in both input and output. This means that mathematical input should appear as though written in a good math textbook, and proofs generated by the system should read as if written by a smart student.
Theorema uses programmable expression syntax in Mathematica to define its own version of a mathematical language based on sets, tuples and common constructs of predicate logic. Customized cascading stylesheets allow one to compose entire theories in notebooks with enhanced functionality and appealing style. The reasoning engine is based on a deduction calculus in natural style making heavy use of syntactic and semantic pattern matching as available in Mathematica for decades. Theorema is developed as a standard Mathematica add-on package and is available free of charge under a GNU GPL license.
The applications of the Theorema system range from higher-level education to research mathematics. The more computational thinking brings algorithms into classrooms, the more the correctness of algorithms will become one of the key topics for future generations of students. Trusted formal proving will be the enabling technique in this area, and the significance of formal reasoning will also increase for pure mathematics. The Theorema system has recently been used for a formalization of second-price auctions in theoretical economics.
Financial and economic systems require a model of agent behaviour. In many cases, these models have long been known to be inherently naive but useful, for example through their tractability or transparency. As more applications require complex automated decisioning and as data complexity increases, many of these simplifications become unacceptable. Behavioural science applications are becoming more widely adopted in fields such as surveillance and intelligent decisioning. This is an important application area in which enterprise behavioural analytics (EBA), combined with enterprise dynamics, enables deeper applied use cases to be implemented. It is also a multi-industry opportunity.
EBA has recently been integrated into Nasdaq's trade surveillance armoury, and is in the process of being applied to AI applications in medtech and agritech.
This presentation outlines the logic behind the development of EBA and proposes future use cases and areas of research.
The Wolfram Language is the perfect tool for rapidly building prototypes for industrial applications: its symbolic nature allows one to quickly capture almost any problem, concept or new idea in a succinct, unified way, making it available for computation. The wide range of functions and built-in knowledge allows developers to concentrate on the goals they want to achieve, without wasting time on technical details. Mathematica's notebook interface and visualization capabilities foster iterative development and short feedback loops.
In this talk, we will show real-world examples of industrial application prototypes we developed during the past few years in very different areas, such as electrical engineering (designing transformers), material science (density functional theory) and computational finance. Our focus is on highlighting the path from the idea to the solution, with surprisingly few lines of code.
After 30 years of development at the frontier of research, Wolfram's technology now offers a wide range of functionality, much of which suggests original and novel educational materials. Mathematical knowledge is just the starting point. Adding interactive interfaces, powerful visualisation features and databases of rich information from almost any known discipline provides unique potential for attractive lessons at any educational level. Considering that all these features can be used on a computer, a tablet or a mobile phone, and can run standalone or in the cloud, there is no excuse to continue without them.
This talk will show examples of how to approach different subjects using some of the most powerful features of the Wolfram Language at different levels of education. Kids can improve their scientific intuition by working with interactive demonstrations of simple concepts and small games based, for example, on simple text analysis or even geometric manipulations. Middle-school students can start coding, or use slightly more complex functionality such as data exploration in plain English. In high school, teachers can use the Wolfram Language to deal with mathematics, science, biology, chemistry and even geography or literature. Finally, at the university level there is little to add: any subject can benefit from the functionality available in Mathematica, Wolfram|Alpha, SystemModeler or any other Wolfram technology.
Secondary cosmic rays (often referred to simply as "cosmic rays") constitute a continuous flux of particles produced at the top of the atmosphere; they mainly consist of muons, which reach every point on the Earth's surface with quite consistent statistical properties.
These freely available and abundant particles are extremely valuable for testing and calibrating charged-particle detectors (even the largest ones at CERN), in combination with a Monte Carlo simulation of the apparatus (usually in C++/Geant4) that includes a realistic joint distribution of the cosmic rays' zenith angle, azimuth and energy, as available in the literature.
The straightforward way to perform these simulations is to randomly generate such cosmic rays on a horizontal plane above the detectors and track them through the apparatus. But since many rays arrive at oblique angles, they must be generated on a surface much larger than the apparatus area, so most of the tracked rays are "wasted."
To overcome this issue, we have developed a new approach: the rays are generated on a half-sphere enclosing the apparatus, with a convenient distribution of "source points" and a suitable angular/energy distribution, so that the cosmic rays' statistical properties hold at every point inside the half-sphere.
The conversion from the usual "planar" three-variate PDF to the "half-spherical" five-variate PDF requires nontrivial analytical calculations, which were performed with Mathematica. The new distribution was then implemented in the C++ code to check for consistency with the usual planar-source results.
The "half-sphere" method allows a tenfold reduction in the Monte Carlo computational time with no detriment to accuracy.
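The inefficiency of the planar method described above is easy to reproduce. The following Python sketch (not the authors' C++/Geant4 code) generates muons on a 10 m x 10 m horizontal plane using the standard sea-level cos²θ zenith-angle approximation, and counts how few of them cross a 1 m x 1 m detector below; the geometry and the simplified energy-free model are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_rays(n, half_size=5.0):
    """Generate n downward-going rays on a horizontal plane of half-width
    `half_size` (metres) at height z = 1, with the standard sea-level
    muon zenith distribution dN/dtheta ~ cos^2(theta) sin(theta)."""
    x = rng.uniform(-half_size, half_size, n)
    y = rng.uniform(-half_size, half_size, n)
    # Inverse-CDF sampling: with u = cos(theta), pdf ~ u^2 on [0, 1]
    cos_theta = rng.uniform(0, 1, n) ** (1.0 / 3.0)
    phi = rng.uniform(0, 2 * np.pi, n)
    sin_theta = np.sqrt(1 - cos_theta**2)
    dirs = np.column_stack([sin_theta * np.cos(phi),
                            sin_theta * np.sin(phi),
                            -cos_theta])          # pointing downward
    return np.column_stack([x, y, np.ones(n)]), dirs

def hits_detector(origins, dirs, half=0.5):
    """Does each straight-line ray cross the z = 0 plane inside a
    1 m x 1 m square detector centred at the origin?"""
    t = -origins[:, 2] / dirs[:, 2]              # ray parameter at z = 0
    xy = origins[:, :2] + t[:, None] * dirs[:, :2]
    return np.all(np.abs(xy) <= half, axis=1)

origins, dirs = sample_rays(200_000)
eff = hits_detector(origins, dirs).mean()
print(f"fraction of generated rays that hit the detector: {eff:.3f}")
```

With this geometry, only about one generated ray in a hundred ever reaches the detector, which is exactly the waste the half-sphere source is designed to eliminate.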
The thermomechanical processing of heavy steel plates starts with the reheating of the slabs, followed by hot rolling and accelerated cooling, and concludes with levelling. In recent years, voestalpine Grobblech in Linz has made great efforts to model the entire production chain with physically based models. Together with prediction tools for the mechanical properties, advanced process automation strategies are being developed and applied. This strategy uses dynamic adaptation of target values and is named "PlateMod control." With this method, the standard deviation of mechanical properties (e.g. tensile strength) across a large series of plates can be restricted to very low values. For the development of all model components, Wolfram Mathematica has been used, together with C++ components for solving the heat transfer equation. PlateMod control was integrated into the automation system of voestalpine Grobblech and has been online for approximately two years.
The distribution of measured mechanical properties for a large plate series shows a smaller standard deviation, as well as a narrower spread between its lower and upper tails, compared to standard production.
In many research areas, counting problems arise that lead to complicated multi-sums. Here symbolic summation, a subfield of computer algebra, enters the game: it helps simplify sum expressions so that they are easy to handle in further calculations. In the last 20 years, advanced difference ring algorithms have been developed and implemented within the Mathematica package Sigma, which assists in this task. As it turns out, this toolbox is applicable not only to enumerative problems coming from combinatorics, but also to sums that arise in particle physics. In close cooperation with the Theory Group of DESY Zeuthen (Johannes Blümlein), complicated three-loop Feynman diagrams with masses are considered that describe certain behaviours of particles. These diagrams, and the corresponding integrals, can be transformed into huge expressions, up to several gigabytes in size and comprising millions of multi-sums. In the last few years, Sigma has been tuned and generalized into a very robust and efficient package, and has been supplemented by many further Mathematica packages to perform such gigantic tasks. In this talk, the underlying algorithmic ideas and the challenges coming from the field of particle physics are presented through concrete examples.
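Sigma's difference-ring machinery goes far beyond this, but the basic flavour of symbolic summation, replacing a sum by a closed form, can be illustrated with a small SymPy sketch (Python is used here purely for illustration; the talk's examples use Sigma in Mathematica):

```python
import sympy as sp

n, k = sp.symbols("n k", integer=True, nonnegative=True)

# Ask for closed forms of two classic sums; SymPy's summation routines
# are a simple ancestor of the telescoping ideas behind Sigma.
s1 = sp.Sum(k, (k, 1, n)).doit()
s2 = sp.Sum(k**2, (k, 1, n)).doit()

print(sp.factor(s1))   # n*(n + 1)/2
print(sp.factor(s2))   # n*(n + 1)*(2*n + 1)/6
```

Once a sum has such a closed form, substituting it into a larger expression is cheap, which is what makes the simplification of gigabyte-scale multi-sum expressions feasible at all.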
In this talk, I will demonstrate some of the capabilities of the native Wolfram Player for iOS, which has recently been updated to 11.3, and talk about the improvements in this version. I will demonstrate how to use the Wolfram Player for iOS application to view and interact with CDF files on your iPad and iPhone, whether they come from your Wolfram Cloud account, websites, email attachments or other cloud storage. I will also demonstrate how to build packages for deployment on this platform, and briefly discuss the technology behind the iOS Player and future directions.
SystemModeler and the Wolfram Language are used for model-based design in industry and education worldwide. With SystemModeler 5.1, the connection between the two tools was strengthened substantially, making much of the system modelling functionality directly available in the Wolfram Language. This talk will illustrate that functionality with use cases from industry and education.
We architects all know that our age is the age of information or, as I have stated in various contexts, the age of computation. Indeed, not only has information taken command, but so has computation, i.e. computational power, resources and abstract mathematical models. These mathematical models, extremely diverse and ever more numerous, are rarely fully accessible within the software packages usually used for architectural design. Such packages do not make it possible to set up truly open generative design approaches that designers can control at will and hybridize with one another, or with other approaches such as machine learning. Here the versatility of Mathematica (the Wolfram Language) as a genuine environment for technical computing and multiparadigm programming gives it considerable advantage and power. We will see, through concrete examples, how Mathematica can push the boundaries of architectural design towards increasingly rich generative design approaches.
Poppy Ergo Jr is a small serial manipulator, designed for and widely used in education. It comes with a multilevel control interface and a Python library. In this presentation, we will demonstrate how we used the Wolfram Language to control this robot and to build several layers of pedagogic activities. Through these activities, fundamentals of mathematics and algorithmics are introduced, namely geometrical modelling and open- and closed-loop control. We will also discuss motion planning and task planning, as well as image manipulation and sensor data analysis.