Imagine you are in a university residence, where you can meet a great number of people. NTMY gives you the opportunity to organise thematic events and helps you make new friends!
The user interacts with the system through a wearable and a smartphone. With the app you can create a new social event in the residence (like watching a football match together on TV or playing a card game). The system will then choose a specific location for the event (considering its type and the free rooms) and assign it a colour; the user's wearable will turn that colour too (e.g. blue).
Whenever a new event is created, people in the residence with matching interests get a notification and, if they decide to take part in the blue event, their wearables turn blue as well. The lights in the reserved room also turn blue to indicate that the room will host the blue event, and along the main streets of the residence some indicators light up in blue to help the participants easily find the event's location.
The system also acts according to the category of the meeting (e.g. TV event, music event, game event, discussion event), performing actions on the surrounding environment such as adjusting the lights and the video (TV) and audio facilities of the room.
Then, when the user is at the event and greets someone, the system will recognise the gesture and register on his phone the personal information of all the people he met (such as name, photo, phone number, Facebook/Instagram, ...), so that at the end of the day he can remember them. No more forgotten names or time lost exchanging phone numbers!
NTMY is...
...sensitive, because it will detect when you are shaking someone's hand.
...responsive, because it will suggest the events that fit your interests.
...adaptive, because it will adapt the environment according to the scheduled event.
...transparent, because the interaction is as natural as a handshake.
...ubiquitous, because you can interact with it anywhere to find your way to your event.
...intelligent, because it will smartly allocate the available rooms according to the types of events.
The goal of the project is to build a system that can be used in university residences to smartly organize events and let people meet others with the same interests. For this purpose, it makes use of different kinds of electronic devices, such as PCs, mini computers, smartphones and wearables, all of which contribute to solving the different problems that can be experienced during the organization of an event.
The system can improve the management of the rooms in the residence, allocating events so as to maximize the number of participants and find the most suitable room for every kind of event. This process is carried out on the central server, which will run an optimization algorithm and generate, for each day, the best event schedule with respect to a number of target functions, such as the number of participants.
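As an illustration, the following is a minimal sketch of how a greedy allocation step could look; the event and room structures (identifiers, capacities, facility sets) are assumptions, not the actual data model.

```python
# Hypothetical greedy allocation step: biggest events first, smallest fitting room.
def allocate_rooms(events, rooms):
    """Assign each event the smallest free room that can hold its expected
    participants and supports its category (TV, music, game, ...)."""
    schedule = {}
    free_rooms = sorted(rooms, key=lambda r: r["capacity"])
    for event in sorted(events, key=lambda e: e["expected_participants"], reverse=True):
        for room in free_rooms:
            if (room["capacity"] >= event["expected_participants"]
                    and event["category"] in room["facilities"]):
                schedule[event["id"]] = room["id"]
                free_rooms.remove(room)
                break
    return schedule

# allocate_rooms(
#     [{"id": "blue", "category": "tv", "expected_participants": 12}],
#     [{"id": "R1", "capacity": 20, "facilities": {"tv", "game"}}])
# -> {"blue": "R1"}
```

A real implementation would optimize over a whole day and several target functions at once, but the sketch conveys the idea of matching event requirements against room capabilities.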
Once the event has been scheduled, the system will also help in the room-preparation phase. Every room will be equipped with a Raspberry Pi that will interface with the central server and set up the environment to make it as enjoyable as possible. It will be able to intelligently control the lights, the TV and the audio system according to the category of the chosen event and to the number of users taking part in it.
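One possible way to encode this behaviour is a mapping from event categories to ambience presets, as in the sketch below; the preset values and field names are purely illustrative.

```python
# Hypothetical mapping from event category to an ambience preset; the real room
# client would translate these values into commands for the smart devices.
AMBIENCE_PRESETS = {
    "tv":         {"lights_brightness": 0.3, "tv_on": True,  "audio_volume": 0.7},
    "music":      {"lights_brightness": 0.6, "tv_on": False, "audio_volume": 0.9},
    "game":       {"lights_brightness": 1.0, "tv_on": False, "audio_volume": 0.4},
    "discussion": {"lights_brightness": 0.8, "tv_on": False, "audio_volume": 0.2},
}

def prepare_room(event):
    """Pick the preset for the event's category, scaling the volume with attendance."""
    preset = dict(AMBIENCE_PRESETS.get(event["category"], AMBIENCE_PRESETS["discussion"]))
    # Illustrative heuristic: slightly raise the volume for larger groups.
    preset["audio_volume"] = min(1.0, preset["audio_volume"] + 0.01 * event["participants"])
    return preset
```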
Smart panels will be placed at the intersections of the residence to help the participants find their way to the events. Whenever a user passes near one of them, the system will identify his wearable (via Bluetooth) and indicate the right direction to reach the specific event room. The panel is controlled by a Raspberry Pi, which will dynamically compute the direction using a shortest-path algorithm on a graph representing the map of the residence.
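The sketch below shows how a panel could compute the next step with Dijkstra's algorithm on a small hypothetical map of the residence; the node names and edge weights are assumptions.

```python
import heapq

# Hypothetical residence map: nodes are rooms and intersections, weights are distances.
RESIDENCE_MAP = {
    "entrance":   {"crossing_a": 10},
    "crossing_a": {"entrance": 10, "crossing_b": 5, "room_1": 7},
    "crossing_b": {"crossing_a": 5, "room_2": 4},
    "room_1":     {"crossing_a": 7},
    "room_2":     {"crossing_b": 4},
}

def next_hop(panel_node, event_room):
    """Run Dijkstra from the panel's own node and return the first step of the
    shortest path towards the event room (the direction the arrow should show)."""
    queue = [(0, panel_node, [])]
    visited = set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node in visited:
            continue
        visited.add(node)
        path = path + [node]
        if node == event_room:
            return path[1] if len(path) > 1 else node
        for neighbour, weight in RESIDENCE_MAP[node].items():
            heapq.heappush(queue, (cost + weight, neighbour, path))
    return None

# next_hop("crossing_a", "room_2") -> "crossing_b", so the panel points that way.
```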
Finally, the last problem our system aims to solve is related to the personal information that people share when they first meet, which is often not easy to remember. In order not to spoil the simplicity of the interaction, our system makes use of an Android Wear smartwatch capable of recognising when users shake hands, informing the server that a connection has been made; users can then check the information about the people they met on their phones.
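On the server side, registering such a connection could be as simple as the SQLite sketch below; the table and column names are assumptions.

```python
import sqlite3

def record_handshake(db_path, user_id, other_id, event_id, met_at):
    """Store a detected handshake so each user can later look up who they met."""
    with sqlite3.connect(db_path) as conn:
        conn.execute(
            """CREATE TABLE IF NOT EXISTS connections (
                   user_id TEXT, other_id TEXT, event_id TEXT, met_at TEXT)""")
        # Store the connection in both directions so either user can query it.
        conn.executemany(
            "INSERT INTO connections VALUES (?, ?, ?, ?)",
            [(user_id, other_id, event_id, met_at),
             (other_id, user_id, event_id, met_at)])

def people_met_by(db_path, user_id):
    with sqlite3.connect(db_path) as conn:
        return conn.execute(
            "SELECT other_id, event_id, met_at FROM connections WHERE user_id = ?",
            (user_id,)).fetchall()
```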
# | Priority | Description |
---|---|---|
RO1 | 1 | Smart allocation of the rooms to the events. |
RO2 | 1 | Adjustment of the colour of the room's lights to indicate the booking of the room. |
RO3 | 2 | Setting of the TV program, if required by the event. |
RO4 | 2 | Music reproduction to fit the event's category. |
RO5 | 3 | Dynamic generation of video and audio playlist based on the tastes of the users in the room. |
# | Priority | Description |
---|---|---|
SA1 | 1 | Listing of all the scheduled events. |
SA2 | 1 | Visualization of all the data collected about the people the user met. |
SA3 | 1 | Creation of a new event. |
SA4 | 1 | Joining of an event. |
SA5 | 3 | Sending notifications about new events, according to the user's tastes. |
# | Priority | Description |
---|---|---|
WR1 | 1 | Setting of the watchface colour based on the event's colour. |
WR2 | 1 | Activation of the data transfer following a handshake. |
WR3 | 2 | Activation of the smart panels when in communication range. |
# | Priority | Description |
---|---|---|
RS1 | 1 | Detection of the presence of a user nearby. |
RS2 | 1 | Indication of the right direction to the user by lighting up its LEDs to form an arrow. |
Hardware architecture: The central server will be implemented on a PC with access to the LAN of the residence. It will be used to store and retrieve the data for both the users and the events.
Software architecture: The central server will run a Python application which will smartly assign rooms to each event, calculate the route from a user's room to the event's room and generate movie and music playlists according to the users participating in that event. It will expose an API, built with Flask and using SQLite as the database backend, that will be used by both the smartphones and the room clients.
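The following minimal sketch shows what two of these Flask endpoints could look like; the route names, database schema and file name are assumptions rather than the final API.

```python
import sqlite3
from flask import Flask, jsonify, request

app = Flask(__name__)
DB = "ntmy.db"  # hypothetical database file

def query(sql, args=(), commit=False):
    """Tiny helper around SQLite; returns rows as dicts or the new row id."""
    with sqlite3.connect(DB) as conn:
        conn.row_factory = sqlite3.Row
        cur = conn.execute(sql, args)
        return cur.lastrowid if commit else [dict(row) for row in cur.fetchall()]

@app.route("/events", methods=["GET"])
def list_events():
    # SA1: listing of all the scheduled events.
    return jsonify(query("SELECT * FROM events"))

@app.route("/events", methods=["POST"])
def create_event():
    # SA3: creation of a new event; the server will later assign it a room and a colour.
    data = request.get_json()
    event_id = query(
        "INSERT INTO events (name, category, creator) VALUES (?, ?, ?)",
        (data["name"], data["category"], data["creator"]), commit=True)
    return jsonify({"id": event_id}), 201

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```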
Network architecture: The central server will be reachable via LAN or WLAN in every room of the residence and will use a static IP address, so that every device on the LAN will know the URI to query in order to access the exposed APIs.
Hardware architecture: Every room will be managed by a Raspberry Pi client, with the role of controlling all the smart devices in the room, based on the information exchanged with the central server.
Software architecture: The room clients will run a Python application with the goal of adjusting lights, TV and audio to fit the category of the event and of adapting the ambient characteristics dynamically. It will communicate with the server via the room API and will be able to control a number of different smart devices.
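A minimal sketch of the room client's main loop is shown below; the server address, room API route and payload fields are assumptions.

```python
import time
import requests

SERVER = "http://192.168.1.10:5000"  # assumed static IP of the central server
ROOM_ID = "R1"                        # unique identifier of this room

def apply_settings(event):
    """Placeholder: translate the event's category into commands for the
    lights, the TV and the audio system via each device's own protocol."""
    print("Preparing room for", event["category"], "event", event["id"])

def main():
    current = None
    while True:
        # Ask the (hypothetical) room API which event is scheduled in this room.
        event = requests.get(f"{SERVER}/rooms/{ROOM_ID}/event", timeout=5).json()
        if event and event.get("id") != current:
            apply_settings(event)
            current = event["id"]
        time.sleep(30)

if __name__ == "__main__":
    main()
```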
Network architecture: The room clients will be connected to the LAN via a wired or wireless connection (no need for static IPs: rooms will have a unique identifier to be recognised by the server) and will use different protocols to control the smart devices around them.
Hardware architecture: Any kind of Android device can be used to interface with the system via the app, provided that it has Bluetooth to connect to the wearable and WiFi to connect to the server.
Software architecture: The smartphone app will be based on Android API level 23 (Android 6.0) and will use the Android Bluetooth interface for point-to-point communications among devices. It will interface with the users API of the server, provide event management functions (create/join) and show information about the people the users meet at the events.
Network architecture: The users' smartphones will use Bluetooth to communicate with their wearables and WiFi to access the server LAN. They will also use Bluetooth to exchange data among users following a handshake.
Hardware architecture: Wearables will be Android Wear devices, used to recognise handshakes among the users and indicate the event the user is going to take part in.
Software architecture: The wearable application will be based on the Android Wear 2.0 API and will coordinate with the smartphone app to show information about the event on the device (like the event's colour) and to activate the data transmission upon recognising the handshake gesture.
Network architecture: The wearables will be connected via Bluetooth to the smartphones and will transparently (via the Wear API) use their WiFi connection if needed.
Hardware architecture: The Smart Road Signs will be managed by a Raspberry Pi board with a LED matrix connected to the GPIO pins.
Software architecture: The sign controller will run a Python application to interface with the server and retrieve information about the events' locations, look for nearby users and control the GPIO pins in order to light up the LEDs.
Network architecture: The sign controller will use a wireless link to communicate with the server and a Bluetooth connection to identify nearby users and provide them with the correct direction.
The smart sign will be implemented with 8 RGB LEDs, driven by the Raspberry Pi board, that will form an arrow pointing towards the event when a user gets close.
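The sketch below illustrates how the sign controller could drive the LEDs through the GPIO pins with the RPi.GPIO library; the pin numbers and arrow patterns depend on the actual wiring and are assumptions.

```python
import time
import RPi.GPIO as GPIO

# Hypothetical wiring: the 8 LEDs sit on these BCM pins, and each arrow pattern
# lists the indices of the LEDs to turn on for a given physical layout.
LED_PINS = [4, 17, 27, 22, 5, 6, 13, 19]
ARROWS = {
    "left":     [0, 1, 2, 3, 4],
    "right":    [3, 4, 5, 6, 7],
    "straight": [1, 3, 5, 7],
}

def setup():
    GPIO.setmode(GPIO.BCM)
    for pin in LED_PINS:
        GPIO.setup(pin, GPIO.OUT, initial=GPIO.LOW)

def show_arrow(direction, seconds=10):
    """Light the LEDs forming the arrow while the user walks past the panel."""
    for index in ARROWS[direction]:
        GPIO.output(LED_PINS[index], GPIO.HIGH)
    time.sleep(seconds)
    for pin in LED_PINS:
        GPIO.output(pin, GPIO.LOW)

# Typical use, after the Bluetooth scan has identified a nearby wearable and the
# shortest-path step has returned e.g. "left":
# setup(); show_arrow("left")
```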
We will make use of Philips Hue lights, driven by the room clients, to control the lighting of the rooms where the events will be held (a minimal control sketch follows this list of devices).
We will make use of a monitor driven by the room clients as a television system for the events that require this kind of facility.
We will make use of a couple of stereo speakers, wired or wireless, driven by the room clients as an audio system for the events that require this kind of facility.
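For the Philips Hue lights, the following is a minimal sketch of how a room client could set the room to the event's colour with the phue library; the bridge address and hue values are assumptions.

```python
from phue import Bridge

# Approximate colour values on the 0-65535 hue scale used by the Hue API.
EVENT_COLOURS = {"blue": 46920, "green": 25500, "red": 0}

def set_room_colour(bridge_ip, colour, brightness=200):
    """Turn every light on the (hypothetical) bridge to the event's colour."""
    bridge = Bridge(bridge_ip)
    bridge.connect()  # the bridge button must have been pressed once beforehand
    for light in bridge.lights:
        light.on = True
        light.saturation = 254
        light.brightness = brightness
        light.hue = EVENT_COLOURS[colour]

# set_room_colour("192.168.1.20", "blue")  # hypothetical bridge address
```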
The following is a list of the open issues our project still needs to face: