The goal of the MPLab is to develop systems that perceive and interact with humans in real time using natural communication channels. To this end, we are developing perceptual primitives to detect and track human faces and to recognize facial expressions. We are also developing algorithms for robots that develop and learn to interact with people on their own. Applications include personal robots, perceptive tutoring systems, and systems for clinical assessment, monitoring, and intervention.

  • Introduction to the MPLab (PDF)
  • MPLAB 5 Year Progress Report (PDF)

  • NEWS

    Georgios at Intel gave me a heads-up about this teaching robot in Japan:\03\06\story_6-3-2009_pg9_9

    A buzzword I picked up from the complete intelligence workshop was “Markov Logic”. Here is an easy-to-read intro paper:
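    The core idea of Markov Logic is to attach real-valued weights to first-order rules and ground them into a Markov network, so that a possible world's probability is proportional to exp of the weighted count of satisfied rule groundings. A minimal sketch in Python (the rule, weight, and constants are made up for illustration and are not taken from the intro paper):

    ```python
    import itertools
    import math

    # Toy Markov Logic example: one weighted rule, Smokes(x) => Cancer(x),
    # grounded over a single constant A. A "world" assigns True/False to each
    # ground atom; P(world) is proportional to exp(w * #satisfied groundings).

    w = 1.5  # rule weight (assumed for illustration)
    atoms = ["Smokes(A)", "Cancer(A)"]

    def n_satisfied(world):
        # The grounding Smokes(A) => Cancer(A) is satisfied in every world
        # except the one where Smokes(A) is True and Cancer(A) is False.
        smokes, cancer = world["Smokes(A)"], world["Cancer(A)"]
        return 0 if (smokes and not cancer) else 1

    # Enumerate all 2^2 possible worlds and their unnormalized weights.
    worlds = [dict(zip(atoms, vals))
              for vals in itertools.product([False, True], repeat=2)]
    weights = {tuple(wld.items()): math.exp(w * n_satisfied(wld))
               for wld in worlds}

    # Conditional query: P(Cancer(A) | Smokes(A)) by summing world weights.
    num = sum(v for k, v in weights.items()
              if dict(k)["Cancer(A)"] and dict(k)["Smokes(A)"])
    den = sum(v for k, v in weights.items() if dict(k)["Smokes(A)"])
    print(round(num / den, 3))  # → 0.818, i.e. exp(w) / (1 + exp(w))
    ```

    Real Markov Logic systems avoid this brute-force enumeration and use approximate inference (e.g. MCMC) over the ground network, but the probability model is the same.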

    Here is a new journal that may be a good venue for our work.

    DATE: Wed, March 4
    TIME: 4:30-6
    PLACE: CSB 003
    SPEAKER: Chris Johnson, Dept of Cognitive Science, UCSD

    Negotiating Carries: Gesture Development in Mother-Infant Bonobos
    C Johnson, S-L Zastrow and M. Halina


    The emergence of gesture in captive bonobos (Pan paniscus) was investigated in longitudinal case studies of three mother-infant dyads at the San Diego Zoo and Wild Animal Park. Videotape, shot on site over the period in which the infants were 10-18 months of age, was reviewed and about 700 examples of attempted or completed carries were collected. Criteria for determining when an interaction was “carryesque”, even when it did not end in a carry, involved identifying a normative, dynamic configuration of co-action and co-attention. The relative moves in these trajectories were further analyzed as being compatible or incompatible with a carry, and as involving configuring one’s own body, manipulating the body of another, or gesturing. Defining gesture as any other-directed, non-forceful, non-carry-enacting move in context, we were able to identify two major classes of gestures during carries in these dyads: attention-getting and iconic. The performance of these gestures showed a marked sensitivity to the attentional state of the other animal. The social ecology of each dyad – situating them in the larger political setting of their respective social groups, and including each dyad’s propensities for compatibility, manipulation, etc. – is argued as exerting selective pressure for certain types of mother-infant negotiations. Of particular interest was one dyad in which a period of both high incompatibility by the mother, and high levels of compatible initiation and manipulation by the infant, resulted in greatly extended negotiations in which carry-specific gestures emerged. Additional preliminary micro-analysis of these gesture-mediated interactions and their predecessors suggests that this development may have involved the salient “freezing” of a normally-continuous, role-specific enactment of the carry, and ultimately the generation of a dissociable gesture routine.

    Christine M. Johnson, Ph.D.
    Department of Cognitive Science
    U.C. San Diego
    La Jolla, CA 92093-0515
    Phn: 858-534-9854
    Fax: 858-534-1128

    The UCSD Department of Cognitive Science is pleased to announce a talk by

    Bilge Mutlu

    Carnegie Mellon University

    Monday, February 23, 2009 at 12pm
    Cognitive Science Building, room 003

    “Designing Socially Interactive Systems”

    Recent advances in artificial intelligence and speech recognition have enabled a new genre of computer interfaces that promise social and cognitive assistance in our day-to-day lives. Humanlike robots, one family of such interfaces, might someday provide social and informational services such as storytelling, educational assistance, and companionship, using complex, adaptive real-world interactions. In my research, I harness existing knowledge of human cognitive and communicative mechanisms and generate new knowledge in order to design these systems such that they more effectively yield social and cognitive benefits. In this talk, I will present a theoretically and empirically grounded framework for designing social behavior for interactive systems. This process draws on theories of social cognition and communication and on formal qualitative and quantitative observations of human behavior, and produces computational models of social behavior that can be enacted by interactive systems. I will present a series of empirical studies that demonstrate how this framework might be used to design social gaze behaviors for humanlike robots, and how participants show social and cognitive improvements, particularly better recall of information, more conversational participation, and stronger rapport and attribution of intentionality, brought about by theoretically based manipulations of the designed gaze behavior. I will also present a vision for future work in this area that provides a framework for interdisciplinary research, drawing on knowledge from and contributing to research in social cognition, human-computer interaction, machine learning, and computational linguistics.

    Reminder: proposals are due March 1st, 2009.


    Boston MA, USA
    2-6 November 2009
    Special sessions in main conference: 2-4 November 2009

    ********* Special Session Proposal Deadline: 1 March 2009 *******

    Acceptance Notification: 22 March 2009

    The ICMI and MLMI conferences will jointly take place in the Boston
    area during November 2-6, 2009. The main aim of ICMI-MLMI 2009 is to
    further scientific research within the broad field of multimodal
    interaction, methods and systems. The joint conference will focus on
    major trends and challenges in this area, and work to identify a
    roadmap for future research and commercial success. The main
    conference will include a number of sessions. Each special session
    should provide an overview of the state-of-the-art, present novel
    methodologies, and highlight important research directions in a field
    of special interest to ICMI participants. Topics of special sessions
    should be focused rather than defined broadly.

    Each special session should comprise 4-5 invited papers. It is
    encouraged that the session begin with an overview paper on the topic
    being addressed, with the remaining papers following up with technical
    contributions on the topic.

    The following information should be included in the proposal:

    * Title of the proposed special session
    * Names/affiliation of the organizers (including brief bio and
    contact info)
    * Session abstract (state significance of the topic and the
    rationale for the proposed session)
    * List of invited presenters (including a tentative title and a
    300-word abstract for each paper)

    Proposals will be evaluated based on the timeliness of the topic and
    relevance to ICMI, the potential impact of the sessions, the quality of
    the proposed content, and the standing of the organizers.

    Please note that all papers in the proposed session will be reviewed
    to ensure that the contributions are of the highest quality. The
    organizer(s) of accepted special sessions will arrange the review
    process, except that papers submitted by the organizers themselves
    will be reviewed by the special session and program co-chairs.
    Once all the papers belonging to a special session have been reviewed,
    final acceptance of the session will be decided upon submission of the
    whole package to the program co-chairs.

    Important Dates for Special Session Submission:

    1 March 2009: Proposal for special sessions due.
    22 March 2009: Decision for special session proposal due.
    29 May 2009: Submission of special session papers to organizers.
    15 July 2009: Acceptance notification.
    1 August 2009: Submission of the whole package and final versions of papers.

    To submit special session proposals (as pdf) or for additional information
    regarding the special sessions, please email
