We aim to digitize smell. Achieving this is currently prevented by gaps in basic science, and we aim to fill these gaps, culminating in a proof of concept for our model. The primary gap we identify is a lack of data on what humans typically smell. Phrased conceptually, in Aim 1 we ask: what are the natural statistics of human olfactory perceptual space? We address this in a series of three experiments, highlighted by one in which we equip participants with a wearable sampling apparatus that we designed and built for this proposal. The apparatus measures sniffing behavior to identify odor sampling, measures neural activity to verify olfactory perception, records video of the visual scene, analyzes total levels of volatile organic compounds in real time, and collects odorant samples for detailed offline analysis. In other words, we will generate an olfactory equivalent of Google Street View, with the addition of chemical, perceptual, and neural data. Using these data we will characterize the natural statistics of human olfactory perceptual space. Moreover, a major contribution of this proposal will be releasing this massive dataset as a publicly available resource.
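To make the structure of this resource concrete, the following is a minimal sketch of how a single synchronized record from the wearable apparatus might be represented. All field names, types, and units here are illustrative assumptions on our part, not the final data schema.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

@dataclass
class OlfactorySample:
    """One synchronized record from the wearable sampling apparatus.

    All fields are hypothetical placeholders for the five data streams
    described in Aim 1; the actual schema would be fixed during the project.
    """
    timestamp_s: float                 # time of the sniff, seconds since session start
    sniff_trace: np.ndarray            # nasal airflow around the sniff (identifies odor sampling)
    neural_trace: np.ndarray           # concurrent neural activity (verifies olfactory perception)
    video_frame_path: str              # frame of the visual scene at sniff onset
    total_voc_ppb: float               # real-time total volatile organic compounds, parts per billion
    canister_id: Optional[str] = None  # physical sample collected for detailed offline analysis
    gps: Optional[tuple] = None        # (lat, lon), for the "olfactory Street View" map

def is_odor_sampling_event(sample: OlfactorySample, flow_threshold: float = 0.5) -> bool:
    """Toy criterion: flag records whose peak inspiratory flow exceeds a threshold."""
    return float(np.max(sample.sniff_trace)) > flow_threshold
```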
Next, in Aim 2 we build on these data to digitize human olfactory perceptual space. We put forth a model that allows us to recreate odors using a restricted set of odor primaries. We will test our model in two frameworks: one we call SmelloVision, in which we develop an algorithmic framework to generate an odor that matches a digital image, and one we call TelleSmell, in which we use a device to sense the environment, an algorithmic framework to transmit the data, and a device to generate the corresponding odor remotely. We provide pilot data for Aim 2: we sensed an odor in Mainz (Germany) and transmitted the data over IP to Rehovot (Israel), where we successfully recreated the smell. This was, to our knowledge, the first transmission of an odor over IP.
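As a rough illustration of both ideas, the sketch below approximates a sensed odor as a non-negative mixture of a small set of odor primaries and serializes the resulting mixing ratios for transmission over IP. This is only a plausible reading of the approach: the feature space, the number of primaries, the use of non-negative least squares, and the wire format are all our own assumptions, not the proposal's published method.

```python
import json
import socket
import numpy as np
from scipy.optimize import nnls

def fit_primaries(target: np.ndarray, primaries: np.ndarray) -> np.ndarray:
    """Approximate a sensed odor as a non-negative mixture of odor primaries.

    target    -- feature vector of the sensed odor (sensor or perceptual features)
    primaries -- matrix with one column per primary, in the same feature space
    Returns the non-negative least squares mixing ratios.
    """
    ratios, _residual = nnls(primaries, target)
    return ratios

def send_odor(ratios: np.ndarray, host: str, port: int = 5000) -> None:
    """Transmit the mixing ratios over IP; a remote device reproduces the odor."""
    payload = json.dumps({"ratios": ratios.tolist()}).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(payload)

# Example with hypothetical numbers: sense an odor, fit it to 30 primaries,
# then (in the pilot's terms) send the ratios from Mainz to Rehovot.
rng = np.random.default_rng(0)
primaries = rng.random((200, 30))     # 200 features x 30 odor primaries
sensed = primaries @ rng.random(30)   # a sensed odor the primaries can span
ratios = fit_primaries(sensed, primaries)
# send_odor(ratios, host="receiver.example.org")  # stub: no real endpoint here
```

Under the same assumptions, the SmelloVision framework would replace the sensed feature vector with one predicted from a digital image, followed by the identical mixture-fitting step.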