Activity Recognition using Context Awareness

Our prime customer was looking for Context-as-a-Service, where context can be sensed, thoroughly understood, and acted upon. Context involves Identity Context (Who am I?), Activity Context (What am I doing?), Time Context (When am I doing it?) and Location Context (Where am I?). Context-aware systems can sense their physical environment & adapt their behavior accordingly. The requirement was to develop a library that can sense human activity (standing, walking, sitting, running), collect data for the user's current activity, and provide recommendations based on the collected data.
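For illustration only, the four context categories can be pictured as a single snapshot structure in C; every type and field name below is an assumption made for this sketch, not part of the delivered library.

    /* Illustrative only: one possible representation of the four context
     * categories described above. */
    #include <stdint.h>
    #include <time.h>

    typedef enum {
        CTX_ACTIVITY_UNKNOWN = 0,
        CTX_ACTIVITY_STANDING,
        CTX_ACTIVITY_SITTING,
        CTX_ACTIVITY_WALKING,
        CTX_ACTIVITY_RUNNING
    } ctx_activity_t;

    typedef struct {
        uint32_t       user_id;    /* identity context: who am I?        */
        ctx_activity_t activity;   /* activity context: what am I doing? */
        time_t         timestamp;  /* time context: when am I doing it?  */
        double         latitude;   /* location context: where am I?      */
        double         longitude;
    } ctx_snapshot_t;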

AIM OF THE PROJECT
  • Design and development of a Context-Aware Library Framework which is modular, extensible to include various context-aware algorithms, and portable to any platform. These algorithms should be accessible to the application layer through easy-to-use APIs (a rough sketch of such an interface follows this list)
  • Design and development of an Activity Recognition Algorithm that takes accelerometer sensor input and runs as part of the context-aware library
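One way the modularity goal can be met is a plug-in style registration interface; all names and signatures below are assumptions for this sketch, not the framework's actual API.

    /* Sketch of a plug-in style framework interface; illustrative names only. */
    #include <stddef.h>

    typedef struct {
        const char *name;                                  /* e.g. "activity_recognition"         */
        int  (*init)(void);                                /* one-time setup                      */
        int  (*process)(const float *sample, size_t len);  /* consume one sensor sample           */
        int  (*get_result)(void *out);                     /* copy latest classification to 'out' */
        void (*deinit)(void);
    } ca_algorithm_t;

    /* Called by each algorithm module at startup. */
    int ca_register_algorithm(const ca_algorithm_t *alg);

    /* Application-layer entry points. */
    int ca_feed_sensor_data(int sensor_id, const float *sample, size_t len);
    int ca_get_context(const char *algorithm_name, void *result);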
THE CHALLENGE

We came across the following challenges while making this system accurate:

  • Collecting data for user activity, as each individual has a different pattern of performing any given activity
  • As the working environment can affect how a user performs an activity, it was a challenge to write an algorithm that collects and analyzes data for the same kind of activity in different environments
  • As each user performs an activity at a different speed, it was also a challenge to decide how frequently the system should collect data (see the windowing sketch after this list)
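A common way to cope with per-user differences in pace is to classify over a fixed-length sliding window of samples rather than individual readings; the sampling rate and window length below are illustrative assumptions, not the values used in the project.

    /* Illustrative windowed collection of accelerometer samples. */
    #include <stddef.h>

    #define AR_SAMPLE_RATE_HZ 50   /* assumed sensor rate     */
    #define AR_WINDOW_SEC     2    /* assumed analysis window */
    #define AR_WINDOW_SAMPLES (AR_SAMPLE_RATE_HZ * AR_WINDOW_SEC)

    typedef struct {
        float  x[AR_WINDOW_SAMPLES];
        float  y[AR_WINDOW_SAMPLES];
        float  z[AR_WINDOW_SAMPLES];
        size_t count;
    } ar_window_t;

    /* Returns 1 when the window is full and ready to be classified. */
    static int ar_window_push(ar_window_t *w, float ax, float ay, float az)
    {
        w->x[w->count] = ax;
        w->y[w->count] = ay;
        w->z[w->count] = az;
        if (++w->count < AR_WINDOW_SAMPLES)
            return 0;
        w->count = 0;   /* caller classifies the full window, then collection restarts */
        return 1;
    }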
FEATURES
  • Highly Modular & Extensible Context-Aware Library written in C
    Capable of easily integrating multiple context-aware algorithms and sensors
  • Set of intuitive APIs available which allow easy integration
  • Low memory footprint
  • Easily portable to any platform
  • Supported Sensors: Accelerometer, Gyroscope, Magnetometer, Microphone, Humidity, Pressure, IR temperature sensors
  • Supported Algorithms: Activity recognition, Posture detection, Sleep detection, Fall & Step detection
  • Support for basic activity classification such as Resting, Walking, Running, Stair Climbing (up or down), Standing and Fall Detection
  • Library sends the classified output to the wearable device display (an illustrative application flow follows this list)
  • Classification accuracy greater than 85% achieved
  • Takes accelerometer data as input
  • Algorithm can be optimized for any user's position/orientation
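Taken together, these features imply an application-side flow along the following lines: feed raw accelerometer samples into the library and show the classified activity on the display. Every function, type and constant name below is an assumption made for this sketch, not the library's actual API.

    /* Illustrative application loop: read accelerometer, feed the library,
     * display the latest classification. */
    typedef enum {
        AR_RESTING, AR_STANDING, AR_WALKING, AR_RUNNING,
        AR_STAIRS_UP, AR_STAIRS_DOWN, AR_FALL
    } ar_activity_t;

    /* Assumed to be provided by the context-aware library. */
    extern int           ar_init(void);
    extern int           ar_feed_accel(float ax, float ay, float az); /* 1 => new result ready */
    extern ar_activity_t ar_get_activity(void);

    /* Assumed to be provided by the wearable's board support package. */
    extern int  bsp_read_accel(float *ax, float *ay, float *az);
    extern void bsp_display_print(const char *text);

    static const char *activity_name(ar_activity_t a)
    {
        static const char *const names[] = {
            "Resting", "Standing", "Walking", "Running",
            "Stairs up", "Stairs down", "Fall detected"
        };
        return names[a];
    }

    int main(void)
    {
        float ax, ay, az;

        ar_init();
        for (;;) {
            if (bsp_read_accel(&ax, &ay, &az) == 0 && ar_feed_accel(ax, ay, az)) {
                bsp_display_print(activity_name(ar_get_activity()));
            }
        }
        return 0;
    }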
VOLANSYS’ ROLE
  • Designed & developed the Context-Aware Library Framework
  • Designed & developed algorithms for Activity Classification, Posture detection, Sleep detection, Fall & Step detection
  • Ported the library and algorithms to a microcontroller-based wearable platform from TI (see the portability sketch below)
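Such a port typically keeps the library platform-independent behind a small set of hooks supplied by the target board; the structure below is only a sketch of that idea, with names assumed for illustration rather than taken from the delivered code.

    /* Illustrative platform abstraction: the library touches the hardware only
     * through these hooks, so a port mainly means filling this table for the
     * target board. */
    #include <stdint.h>

    typedef struct {
        int      (*accel_read)(float *ax, float *ay, float *az);  /* raw accelerometer sample   */
        uint32_t (*time_ms)(void);                                /* monotonic millisecond tick */
        void     (*display_print)(const char *text);              /* wearable display output    */
    } ca_platform_t;

    /* Supplied once by the port layer during initialization. */
    int ca_platform_bind(const ca_platform_t *platform);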