This paper describes a novel system for detecting and classifying human activities based on a multi-sensor approach. The aim of this research is to create a loosely structured environment in which activity is constantly monitored and automatically classified, transparently to the observed subjects. The system uses four calibrated cameras installed in the monitored room and a body-mounted wireless accelerometer on each person, exploiting the complementary features of the different sensors to maximize recognition accuracy and to improve scalability and reliability. Both the system's structure and the algorithms on which it is based are designed to analyze and classify complex movements (such as walking, sitting, jumping, running, and falling) performed by multiple people at the same time. Here, we describe a preliminary application in which action classification is mostly aimed at detecting falls. Several instances of a hybrid classifier, based on Support Vector Machines and Hierarchical Temporal Memories (a recent bio-inspired computational paradigm), are used to detect potentially dangerous activities of each person in the environment. If such an activity is detected, and if the person “in danger” is wearing the accelerometer, the system localizes and activates the sensor, receives its data, and performs a more reliable fall detection using a specifically trained classifier. Activating the accelerometer only on demand extends its battery life. Beyond surveillance, the system could also be used to assess the degree of independence of elderly people or, in rehabilitation, to assist patients during recovery.
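The two-stage decision flow described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the names `Person`, `monitor_step`, and the classifier callables are hypothetical stand-ins for the camera-based SVM/HTM hybrid classifier and the accelerometer-based fall detector.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Person:
    """A tracked subject; hypothetical model for illustration only."""
    pid: int
    has_accelerometer: bool
    accelerometer_on: bool = False

def monitor_step(person: Person,
                 camera_classifier: Callable[[int], str],
                 accel_classifier: Callable[[int], bool]) -> str:
    """One monitoring cycle for a single tracked person.

    camera_classifier stands in for the SVM/HTM hybrid working on
    camera data; accel_classifier for the specifically trained
    accelerometer-based fall detector.
    """
    # Stage 1: camera-based activity classification (always on).
    activity = camera_classifier(person.pid)  # e.g. "walking", "sitting", "danger"
    if activity != "danger":
        return activity
    if not person.has_accelerometer:
        # No wearable to consult; report the camera-only assessment.
        return "danger (camera only)"
    # Stage 2: turn the wearable on only on demand, preserving battery,
    # and run the more reliable accelerometer-based fall detection.
    person.accelerometer_on = True
    fall = accel_classifier(person.pid)
    person.accelerometer_on = False
    return "fall confirmed" if fall else "false alarm"
```

The key design point reflected here is that the expensive, battery-draining sensor is queried only after the always-on cameras have raised a suspicion, which is what allows the wearable's battery life to be extended.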