With over 150,000 passengers passing through Changi Airport every day, the airport faces the challenge of optimizing the visitor experience: matching visitors with the right airport amenities and the right information at the right time.
AirPortal opens automatically when travelers connect to Changi Airport’s Wi-Fi; there is no app to download. Travelers either scan the barcode on their boarding pass or simply enter their flight number, and they see the information that is relevant to them at that moment, from up-to-date gate information to restaurant recommendations. No information overload.
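The core idea of showing the right information at the right time can be sketched roughly as follows. This is an illustrative Python sketch only; the rules, field names, and the 45-minute threshold are assumptions for illustration, not the actual AirPortal logic:

```python
from datetime import datetime, timedelta

# Illustrative sketch: pick what to show a traveler based on how close
# their flight is to boarding. All thresholds and fields are assumptions.

def relevant_info(flight, now):
    minutes_to_boarding = (flight["boarding_time"] - now) / timedelta(minutes=1)
    if minutes_to_boarding <= 45:
        # Close to boarding: gate information matters most.
        return ["gate", "walking_time_to_gate"]
    # Plenty of time: surface nearby amenities instead.
    return ["restaurants", "shops", "lounges"]

flight = {"number": "SQ321", "boarding_time": datetime(2017, 5, 1, 14, 30)}
print(relevant_info(flight, datetime(2017, 5, 1, 14, 0)))  # ['gate', 'walking_time_to_gate']
print(relevant_info(flight, datetime(2017, 5, 1, 9, 0)))   # ['restaurants', 'shops', 'lounges']
```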
I was the product manager on this team. I designed the information flow and the recommendation algorithm, coordinated prototype development between the design and software engineering team members, and presented the prototype at the Changi Airport Hackathon in San Francisco, CA.
As part of P&G IT’s innovation effort, we wanted a better understanding of the technical challenges of supporting the company’s progressing move toward the Internet of Things. While the effort had originally been driven largely by R&D, IT was now increasingly needed to support over-the-air updates, a solid cloud infrastructure, and data management. The IT department decided to build a functioning IoT prototype to learn more about the challenges and possible solutions.
We decided the prototype should be something usable within our own office space. After several rounds of brainstorming, we settled on a room occupancy tracker that would give immediate visibility, via an online dashboard, into whether a meeting room was free.
We first built an IoT device using an Arduino prototyping platform; later, however, we assembled and soldered our own boards. We connected humidity, temperature, motion, and brightness sensors, applied the logic directly on the device to save power (we chose a battery-driven solution for more independence and flexibility), and pushed the data to a database in the cloud, which fed the dashboard.
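The on-device logic can be sketched roughly like this. The actual firmware ran on the Arduino in C/C++; this Python sketch only illustrates the idea, and the sensor readings, thresholds, and combination rule are assumptions:

```python
# Illustrative sketch of the on-device occupancy logic.
# Thresholds and the decision rule are assumptions, not the real firmware.

def read_sensors():
    # Placeholder for reading the humidity, temperature,
    # motion, and brightness sensors on the device.
    return {"humidity": 45.0, "temperature": 22.5,
            "motion": True, "brightness": 310}

def room_occupied(readings, motion_window):
    # Treat the room as occupied if motion was detected recently;
    # brightness acts as a secondary signal (lights on).
    recent_motion = any(motion_window)
    lights_on = readings["brightness"] > 200  # assumed threshold
    return recent_motion or lights_on

# Evaluating the rule on-device means only the final occupancy flag
# has to be transmitted, which helps conserve battery power.
window = [False, True, False]  # last few motion readings
print(room_occupied(read_sensors(), window))  # True
```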
Finally, we designed and 3D printed a case for more robust handling.
I initiated the project and was involved in all steps. My main responsibility was communicating our progress (creating blog posts, a promotional video) and presenting the finished prototype to executives of the company.
Integrate real-time bio-sensory data into a virtual reality experience
Leveraging the Mio Alpha watch as a real-time heart rate sensor, we integrated the heart rate signal into a virtual reality environment in Unity by creating a heart object that palpitates to the rhythm of the wearer’s real heart.
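The mapping from heart rate to pulsation can be sketched as follows. The actual implementation lived in a Unity C# script; this Python sketch only shows the idea, and the base scale and amplitude values are assumptions:

```python
import math

def heart_scale(bpm, t, base=1.0, amplitude=0.15):
    # Map a heart rate in beats per minute to a pulsating scale
    # factor for the heart object at time t (seconds): one pulse per beat.
    beats_per_second = bpm / 60.0
    phase = 2 * math.pi * beats_per_second * t
    # abs(sin) gives a sharper contraction-relaxation shape than plain sin
    return base + amplitude * abs(math.sin(phase))

# At 60 bpm the object completes one full pulse per second:
print(round(heart_scale(60, 0.25), 2))  # 1.15 (peak of the pulse)
print(round(heart_scale(60, 0.0), 2))   # 1.0  (resting scale)
```

In Unity, a value like this would be applied to the heart object’s transform scale every frame, with `bpm` updated from the watch’s live readings.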
The goal of this project was to analyze and visualize where movies were shot in San Francisco. To achieve this, information from different data sources was combined, analyzed, and visualized using concepts from the Spatial Data Analysis course.
The first step in this multi-step process was data gathering. The main dataset used in this project comes from the San Francisco Open Data website and contains a list of movies and TV shows with scenes filmed in San Francisco, additional details such as cast and year, and, most importantly for the analysis, a textual description of the location where each movie or show was shot. Overall, the dataset contains descriptions of 1,586 locations in San Francisco. The dataset was downloaded from the website and imported into Matlab for inspection, cleaning, analysis, and visualization.
Functionality from the Google Geocoding API, the Foursquare API, and the Open Movie Database API was used to gather additional data, and an external MATLAB File Exchange program (Google Map Plotter by Zohar Bar-Yehuda) was used to enhance the visualizations.
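As an example of how geocoding turns the textual location descriptions into coordinates, the following sketch reduces a Google Geocoding API response to a latitude/longitude pair. The original work did this in Matlab; this is a Python sketch, the HTTP request itself (which needs an API key) is omitted, and the sample response is abbreviated to the relevant fields:

```python
# Sketch: extract coordinates from a Google Geocoding API JSON response.
# The sample dict below mirrors the API's documented response shape.

def extract_latlng(geocode_response):
    # Take the first (best-matching) result's coordinates, or None.
    results = geocode_response.get("results", [])
    if not results:
        return None
    loc = results[0]["geometry"]["location"]
    return loc["lat"], loc["lng"]

sample = {
    "status": "OK",
    "results": [{
        "formatted_address": "Golden Gate Bridge, San Francisco, CA",
        "geometry": {"location": {"lat": 37.8199, "lng": -122.4783}},
    }],
}
print(extract_latlng(sample))  # (37.8199, -122.4783)
```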
For an interactive user experience, a script was created that lets users explore the dataset on their own terms. The program prompts users for a location description in San Francisco (e.g., ‘Golden Gate Bridge’) or a concrete address (e.g., ‘888 Brannan Street’) and a radius in meters within which to search for movie locations. Based on the user’s input, the tool then returns a rich variety of information: most importantly, a map of the location and its immediate surroundings and a list of movies that were shot within the specified radius.
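The core of the radius search can be sketched as a great-circle distance filter over the geocoded movie locations. The original implementation was in Matlab; this is an illustrative Python sketch, and the movie coordinates below are hypothetical sample values, not figures from the dataset:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle (haversine) distance between two points, in meters.
    r = 6371000.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def movies_within(center, radius_m, locations):
    # locations: list of (title, lat, lng) tuples for geocoded shoot sites.
    lat, lng = center
    return sorted({title for title, mlat, mlng in locations
                   if haversine_m(lat, lng, mlat, mlng) <= radius_m})

# Hypothetical sample coordinates near the Golden Gate Bridge,
# plus one downtown location well outside a 500 m radius.
locs = [("Vertigo", 37.8205, -122.4790),
        ("San Andreas", 37.8210, -122.4770),
        ("Bullitt", 37.7749, -122.4194)]
print(movies_within((37.8199, -122.4783), 500, locs))  # ['San Andreas', 'Vertigo']
```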
Movies shot here were ‘Burglar’, ‘My Reality’, ‘San Andreas’, and ‘Vertigo’.
A second visualization was a side-by-side plot of movie locations by genre.