I-SEARCH is an FP7 specific targeted research project aiming to create the first search engine able to handle multimedia and multimodal content (text, 2D images, sketches, video, 3D objects, audio, and combinations of the above), any of which can be used as a query to retrieve relevant content of any of the aforementioned types.
I-SEARCH will overcome the limitations of current content-based multimedia retrieval methods through the realization of a novel Rich Unified Content Description (RUCoD), which will integrate descriptions of all of the above types of content, real-world information (GPS, temperature, time, weather sensors, RFID objects), emotional cues and social descriptors, in order to better express what the user wants to retrieve. I-SEARCH will develop a novel generation of multimodal search engines providing users with natural and expressive interfaces.
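To make the idea of a unified description concrete, the sketch below models a descriptor that bundles modality-specific low-level features with real-world context, emotional cues, and social tags. This is a hypothetical simplification for illustration only; all class and field names are assumptions and do not reflect the actual RUCoD specification.

```python
from dataclasses import dataclass, field
from typing import Optional, Tuple

# NOTE: illustrative sketch only -- the real RUCoD format is defined by
# the I-SEARCH project; every name below is a hypothetical placeholder.

@dataclass
class RealWorldContext:
    gps: Optional[Tuple[float, float]] = None  # (latitude, longitude)
    temperature_c: Optional[float] = None
    timestamp: Optional[str] = None            # e.g. ISO 8601 string

@dataclass
class UnifiedDescriptor:
    media_type: str                                  # "text", "image", "3d", "audio", "video"
    low_level: dict = field(default_factory=dict)    # modality-specific features
    context: RealWorldContext = field(default_factory=RealWorldContext)
    emotional_cues: list = field(default_factory=list)
    social_tags: list = field(default_factory=list)

# A multimodal query bundles descriptors from several content types:
query = [
    UnifiedDescriptor("image", low_level={"color_hist": [0.2, 0.5, 0.3]}),
    UnifiedDescriptor("audio", low_level={"mfcc": [1.1, 0.4]},
                      emotional_cues=["calm"]),
]
```

The point of such a structure is that a single retrieval engine can match any descriptor field of the query against the corresponding fields of indexed content, regardless of the original modality.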
Additionally, I-SEARCH will propose novel solutions for relevance feedback, based on users’ social behaviour and recommendations. The above will result in a highly user-centric search engine, able to deliver to end-users only the content of interest, satisfying their information needs and preferences. I-SEARCH will also introduce efficient tools for visualising the search results in order to enhance the presentation layer of search engines. Several aspects, such as the user profile, the end-user terminal, the available network bandwidth, and the preferred interaction modality, will be taken into account to achieve the optimal presentation result. Finally, the search engine will dynamically adapt to the end-user’s device, which may range from a simple mobile phone to a high-performance PC.
ITI is the coordinator of the I-SEARCH project. In addition to project management, ITI will be involved in the extraction of low-level descriptors for 3D content and real-world descriptors, the RUCoD specification, relevance feedback, multimodal annotation propagation, and visual analytics.