<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <description>DiLi lab at the Department of Computational Linguistics, University of Zurich. 👀🤖📖🧠💬 https://www.cl.uzh.ch/en/research-groups/digital-linguistics.html</description>
    <link>https://bsky.app/profile/dili-lab.bsky.social</link>
    <title>@dili-lab.bsky.social - Digital Linguistics Lab</title>
    <item>
      <link>https://bsky.app/profile/dili-lab.bsky.social/post/3mgowtgqawc26</link>
      <description>EyeBench is a mega-project providing much-needed infrastructure for easy loading &amp; preprocessing of eye-tracking-for-reading datasets, and for addressing super exciting modeling challenges: decoding linguistic knowledge and reading interactions from gaze!&#xA;&#xA;Check it out here: eyebench.github.io</description>
      <pubDate>10 Mar 2026 08:20 +0000</pubDate>
      <guid isPermaLink="false">at://did:plc:e2gupj2qxvk5owfvpuq64u5b/app.bsky.feed.post/3mgowtgqawc26</guid>
    </item>
    <item>
      <link>https://bsky.app/profile/dili-lab.bsky.social/post/3mgmxijubrc2s</link>
      <description>👀 📣 To all users of eye-tracking-while-reading datasets: check out our comprehensive, filterable dataset overview!&#xA;&#xA;Dataset overview: dili-lab.github.io/datasets.html&#xA;&#xA;Preprint: arxiv.org/abs/2602.19598&#xA;&#xA;Add or edit your dataset: https://www.cl.uzh.ch/en/research-groups/digital-linguistics/resources/dataset-review.html&#xA;&#xA;#FAIR #eyetracking #datasets&#xA;https://dili-lab.github.io/datasets.html</description>
      <pubDate>09 Mar 2026 13:27 +0000</pubDate>
      <guid isPermaLink="false">at://did:plc:e2gupj2qxvk5owfvpuq64u5b/app.bsky.feed.post/3mgmxijubrc2s</guid>
    </item>
    <item>
      <link>https://bsky.app/profile/dili-lab.bsky.social/post/3llc5etmhs22w</link>
      <description>Excited to share that our group will present 9 papers at this year&#39;s ACM Symposium on Eye Tracking Research &amp; Applications (ETRA) in Tokyo!&#xA;&#xA;We will post summaries of each paper in the coming weeks, but here&#39;s a quick sneak peek 👀</description>
      <pubDate>26 Mar 2025 15:53 +0000</pubDate>
      <guid isPermaLink="false">at://did:plc:e2gupj2qxvk5owfvpuq64u5b/app.bsky.feed.post/3llc5etmhs22w</guid>
    </item>
    <item>
      <link>https://bsky.app/profile/dili-lab.bsky.social/post/3ll6zax7cps2y</link>
      <description>At this year&#39;s ACL in Vienna, @lenajaeger.bsky.social and David Reich from our group, together with @whylikethis.bsky.social and Omer Shubi, will be hosting a tutorial on Eye Tracking and NLP 👀 🖥️ Be there to join us!&#xA;&#xA;More information can be found here: acl2025-eyetracking-and-nlp.github.io&#xA;https://acl2025-eyetracking-and-nlp.github.io/</description>
      <pubDate>25 Mar 2025 10:01 +0000</pubDate>
      <guid isPermaLink="false">at://did:plc:e2gupj2qxvk5owfvpuq64u5b/app.bsky.feed.post/3ll6zax7cps2y</guid>
    </item>
    <item>
      <link>https://bsky.app/profile/dili-lab.bsky.social/post/3ll4w2q6dck2y</link>
      <description>Hello Bluesky world! This is our very first post from the Digital Linguistics (DiLi) team — and we’re kicking things off with exciting news:&#xA;&#xA;🖥️ 🧠 We are pleased to announce that the abstract submission for the first Computational Psycholinguistics Meeting 2025 is now open!</description>
      <pubDate>24 Mar 2025 13:58 +0000</pubDate>
      <guid isPermaLink="false">at://did:plc:e2gupj2qxvk5owfvpuq64u5b/app.bsky.feed.post/3ll4w2q6dck2y</guid>
    </item>
  </channel>
</rss>