The Session track has two primary goals: (1) to test whether systems can improve performance on the current query by using the user's previous interactions with a retrieval system (previous queries, clicks on ranked results, dwell times, etc.), and (2) to evaluate system performance over an entire query session rather than over a single query.

The TREC track itself has focused on the first goal. This site addresses the second goal by simulating user interactions with ranked results.
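For concreteness, the interaction history available to a system in one round might look like the Python sketch below. The field names, the query, and the document IDs are illustrative assumptions, not the track's official data schema.

    # Illustrative only: field names and values are assumptions, not the track's schema.
    interaction_round = {
        "query": "hurricane katrina timeline",     # query issued this round
        "results": [                               # ranked document IDs shown to the user
            "clueweb09-en0002-94-30953",
            "clueweb09-en0011-12-00041",
        ],
        "clicks": ["clueweb09-en0002-94-30953"],   # documents the user clicked
        "dwell_times": {"clueweb09-en0002-94-30953": 42.0},  # seconds spent on each clicked page
    }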


Please select one of the three options below. Hover over a ? icon for more information. Click here for a tutorial.
1 Start a new session run
2 Continue an ongoing session run (a sketch of this round-by-round loop follows the list)
   Run ID: ?
   Interaction round: ?
   Results file: ?
   Snippets (optional): ?
3 Download a completed session run
   Run ID: ?
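The round-by-round workflow in option 2 works as follows: for each interaction round, a system uploads a results file (and, optionally, the snippets shown to the simulated user), and the site returns simulated interactions that can inform the next round's ranking. The sketch below only covers writing one round's results file; the helper name, scores, and document IDs are illustrative assumptions, and it assumes the results file uses the standard six-column TREC run format (topic, Q0, docno, rank, score, run tag).

    # A minimal sketch of preparing one interaction round for upload.
    # Assumes the results file follows the standard TREC run format:
    #   topic Q0 docno rank score run_id
    def write_results_file(path, run_id, ranked):
        """Write one round's rankings, one line per retrieved document."""
        with open(path, "w") as f:
            for topic, docs in ranked.items():
                for rank, (docno, score) in enumerate(docs, start=1):
                    f.write(f"{topic} Q0 {docno} {rank} {score:.4f} {run_id}\n")

    # Example: two documents retrieved for topic 1 in interaction round 1.
    write_results_file(
        "myrun.round1.txt",
        run_id="myrun",
        ranked={"1": [("clueweb09-en0002-94-30953", 12.31),
                      ("clueweb09-en0011-12-00041", 11.87)]},
    )

After uploading the file under the matching Run ID and interaction round, the simulated clicks returned by the site would feed into producing the next round's results file, until the session run is complete and can be downloaded via option 3.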