<?xml version="1.0"?>
<News hasArchived="false" page="1" pageCount="1" pageSize="10" timestamp="Thu, 23 Apr 2026 09:17:07 -0400" url="https://dev.my.umbc.edu/groups/umbc-ai/posts.xml?tag=vlm">
  <NewsItem contentIssues="false" id="154576" important="false" status="posted" url="https://dev.my.umbc.edu/groups/umbc-ai/posts/154576">
    <Title>Talk: Wikipedia from the World: Grounded Articles from Any Source, 11/24</Title>
    <Tagline>4-5:15 pm EST Monday, Nov. 24, 2025 in ITE 229 &amp; Online</Tagline>
    <Body>
      <![CDATA[
          <div class="html-content"><h5>Wikipedia from the World: Grounded Articles from Any Source</h5><h5>Alexander Martin, JHU</h5><div><strong>4-5:15pm EST Monday, Nov. 24 in ITE229 and</strong> <a href="https://meet.google.com/dgs-edxk-cfq" rel="nofollow external" class="bo"><strong>online</strong></a></div><div><br></div><div>Whether tracking emerging events, analyzing economic trends, or understanding public discourse, valuable information is scattered across modalities, from professionally produced news content and curated Wikipedia articles to firsthand footage of disasters livestreamed on social media. Building systems that can effectively retrieve, reason over, and synthesize these heterogeneous information sources is essential for knowledge-intensive applications.</div><div><br></div><div>This talk will focus on advancing both sides of the information-seeking pipeline: retrieving relevant multimodal evidence at scale, and synthesizing that evidence into coherent, Wikipedia-style explanations grounded in verifiable evidence. For retrieval, we will focus on recent progress in large-scale <a href="https://www.amazon.science/blog/using-generative-ai-to-do-multimodal-information-retrieval" rel="nofollow external" class="bo">multimodal retrieval</a>, including new datasets, efficient and scalable first-stage retrievers, and reasoning-based reranking. For Wikipedia-style article generation, we will cover benchmarking and evaluating multimodal article generation, along with a method enabling the use of <a href="https://www.nvidia.com/en-us/glossary/vision-language-models/" rel="nofollow external" class="bo">VLMs</a> for high-level reasoning.
Together, these components outline a path toward unified systems capable of transforming large collections of multimodal evidence into verifiable, human-readable articles.</div><div><br></div><div><a href="https://alexmartin1722.github.io/" rel="nofollow external" class="bo"><strong>Alexander Martin</strong></a> is a PhD candidate at Johns Hopkins University’s Center for Language and Speech Processing (<a href="https://www.clsp.jhu.edu/" rel="nofollow external" class="bo">CLSP</a>) and Human Language Technology Center of Excellence (<a href="https://hltcoe.jhu.edu/" rel="nofollow external" class="bo">HLTCOE</a>). He is advised by Dr. Benjamin Van Durme. Alex’s research focuses on end-to-end multimodal information retrieval and reasoning. His work aims to produce Wikipedia-style articles, grounded in retrieved documents and videos, in response to information-seeking queries. His research has been published in CVPR, ACL, NAACL, and EMNLP. Alex is a recipient of the NSF’s Graduate Research Fellowship.</div><div><br></div><div>Hosted by Prof. <a href="https://www.tejasgokhale.com/" rel="nofollow external" class="bo">Tejas Gokhale</a> at UMBC ITE 229 and <a href="https://meet.google.com/dgs-edxk-cfq" rel="nofollow external" class="bo">online</a>.</div></div>
      ]]>
    </Body>
    <Summary>Wikipedia from the World: Grounded Articles from Any Source  Alexander Martin, JHU  4-5:15pm EST Monday, Nov. 24 in ITE229 and online     Whether tracking emerging events, analyzing economic...</Summary>
    <Website>https://www.tejasgokhale.com/seminar.html</Website>
    <TrackingUrl>https://dev.my.umbc.edu/api/v0/pixel/news/154576/guest@my.umbc.edu/44a9d69937b5e480a80dbfeebb8f361b/api/pixel</TrackingUrl>
    <Tag>genai</Tag>
    <Tag>information-retrieval</Tag>
    <Tag>llm</Tag>
    <Tag>multimodal</Tag>
    <Tag>nlp</Tag>
    <Tag>talk</Tag>
    <Tag>vlm</Tag>
    <Tag>wikipedia</Tag>
    <Group token="umbc-ai">UMBC AI</Group>
    <GroupUrl>https://dev.my.umbc.edu/groups/umbc-ai</GroupUrl>
    <AvatarUrl>https://assets4-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
    <AvatarUrl size="original">https://assets2-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/original.png?1691095779</AvatarUrl>
    <AvatarUrl size="xxlarge">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxlarge.png?1691095779</AvatarUrl>
    <AvatarUrl size="xlarge">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xlarge.png?1691095779</AvatarUrl>
    <AvatarUrl size="large">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/large.png?1691095779</AvatarUrl>
    <AvatarUrl size="medium">https://assets3-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/medium.png?1691095779</AvatarUrl>
    <AvatarUrl size="small">https://assets3-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/small.png?1691095779</AvatarUrl>
    <AvatarUrl size="xsmall">https://assets4-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
    <AvatarUrl size="xxsmall">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxsmall.png?1691095779</AvatarUrl>
    <Sponsor>UMBC</Sponsor>
    <ThumbnailUrl size="xxlarge">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/xxlarge.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="xlarge">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/xlarge.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="large">https://assets2-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/large.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="medium">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/medium.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="small">https://assets3-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/small.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="xsmall">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/xsmall.jpg?1763330590</ThumbnailUrl>
    <ThumbnailUrl size="xxsmall">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/154/576/737bb9b992cfe9a9f955d8e5875b7ebd/xxsmall.jpg?1763330590</ThumbnailUrl>
    <ThumbnailAltText>multimodal information retrieval</ThumbnailAltText>
    <PawCount>0</PawCount>
    <CommentCount>0</CommentCount>
    <CommentsAllowed>true</CommentsAllowed>
    <PostedAt>Mon, 17 Nov 2025 08:14:06 -0500</PostedAt>
  </NewsItem>
</News>
