In the context of health care for elderly people in a domestic environment, it is essential for a robot to understand the pose of a person in a scene. Knowing the position and orientation of every limb of a human body makes it possible to extract higher-level information, such as the type of action being performed or the identification of unsafe positions. The proposed algorithm is based on the widely used Iterative Closest Point (ICP) method and operates on three-dimensional data. The algorithm developed in this project, called Constrained Articulated-ICP, is a variant of this method adapted to the articulated properties of the human body. Several techniques have therefore been developed to improve pose recognition and to ensure its validity. A human model is created, and the orientation of every limb is found by iterative matching; the pose information is then extracted directly from this model. Designed for a real-time application, the pose tracking proved to be accurate and stable. The evaluation performed highlights its ability to recognise the human pose in motion and during everyday movements such as sitting or walking.
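At the core of such a pipeline is the standard ICP alignment step: alternately matching each source point to its nearest target point, then re-estimating the best rigid transform from those correspondences. The following is a minimal rigid-ICP sketch for illustration only; it is not the Constrained Articulated-ICP described above, whose per-joint constraints and human model are not detailed in this abstract. All function names here are hypothetical.

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst
    (Kabsch algorithm via SVD). src and dst are (N, 3) paired points."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:      # fix an improper rotation (reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cd - R @ cs
    return R, t

def icp(src, dst, iters=30):
    """Iteratively align the source cloud to the target cloud:
    nearest-neighbour matching, then rigid re-alignment."""
    cur = src.copy()
    for _ in range(iters):
        # brute-force nearest-neighbour correspondences (for clarity only)
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(axis=-1)
        matched = dst[d2.argmin(axis=1)]
        R, t = best_rigid_transform(cur, matched)
        cur = cur @ R.T + t
    return cur
```

An articulated variant would run such an alignment per body segment while constraining neighbouring segments to stay connected at their joints, which is the adaptation the abstract refers to.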