<?xml version="1.0"?>
<News hasArchived="false" page="1" pageCount="1" pageSize="10" timestamp="Mon, 20 Apr 2026 20:23:52 -0400" url="https://dev.my.umbc.edu/groups/umbc-ai/posts.xml?tag=concepts">
  <NewsItem contentIssues="true" id="138553" important="false" status="posted" url="https://dev.my.umbc.edu/groups/umbc-ai/posts/138553">
  <Title>Talk: Visual Concept Learning Beyond Appearances, 3:30pm 2/8</Title>
  <Tagline>Modernizing a couple of classic ideas</Tagline>
  <Body>
    <![CDATA[
    <div class="html-content"><h5>PPR Distinguished Speaker</h5><div><br></div><h4>Visual Concept Learning Beyond Appearances: Modernizing a Couple of Classic Ideas</h4><h5><a href="https://yezhouyang.engineering.asu.edu/" rel="nofollow external" class="bo">Dr. Yezhou Yang</a><br>Arizona State University</h5><div><br></div><h5>3:30-4:45 pm ET, Thursday, Feb. 8, 2024</h5><h5>ITE 325b &amp; via <a href="https://umbc.webex.com/meet/gokhale" rel="nofollow external" class="bo">WebEx</a></h5><div><br></div><div>The goal of <a href="https://en.wikipedia.org/wiki/Computer_vision" rel="nofollow external" class="bo">Computer Vision</a>, as framed by <a href="https://en.wikipedia.org/wiki/David_Marr_(neuroscientist)" rel="nofollow external" class="bo">Marr</a>, is to develop algorithms that answer "what," "where," and "when" from visual appearance. The speaker, among others, recognizes the importance of studying the underlying entities and relations beyond visual appearance, following an Active Perception paradigm. This talk presents the speaker's efforts over the last decade, ranging from 1) reasoning beyond appearance for vision-and-language tasks (VQA, captioning, T2I, etc.) and addressing their evaluation misalignment, through 2) reasoning about implicit properties, to 3) the roles of both in a robotic visual concept learning framework. The talk will also feature the Active Perception Group (APG)'s projects addressing the nation's emerging challenges in the automated mobility and intelligent transportation domains at the ASU School of Computing and Augmented Intelligence (SCAI).</div><div><br></div><div><a href="https://yezhouyang.engineering.asu.edu/" rel="nofollow external" class="bo"><strong>Yezhou (YZ) Yang</strong></a> is an Associate Professor and a Fulton Entrepreneurial Professor in the School of Computing and Augmented Intelligence (SCAI) at Arizona State University.
He founded and directs the ASU Active Perception Group, and currently serves as the topic lead (situation awareness) at the Institute of Automated Mobility, Arizona Commerce Authority. He is also a thrust lead (AVAI) at Advanced Communications Technologies (ACT, a Science and Technology Center under the New Economy Initiative, Arizona). His work includes exploring visual primitives and representation learning for visual (and language) understanding, grounding them in natural language, and high-level reasoning over these primitives for intelligent systems, secure/robust AI, and V&amp;L model evaluation alignment. Yang is a recipient of the 2011 Qualcomm Innovation Fellowship, the 2018 NSF CAREER award, and the 2019 Amazon AWS Machine Learning Research Award. He received his Ph.D. from the University of Maryland at College Park, and his B.E. from Zhejiang University, China. He is a co-founder of ARGOS Vision Inc, an ASU spin-off company.</div><div><br></div><div>The Advances in Perception, Prediction, and Reasoning (PPR) talks are organized and hosted by UMBC Professor <a href="https://www.tejasgokhale.com/" rel="nofollow external" class="bo">Tejas Gokhale</a>.</div><div><br></div></div>
]]>
  </Body>
  <Summary>PPR Distinguished Speaker. Visual Concept Learning Beyond Appearances: Modernizing a Couple of Classic Ideas. Dr. Yezhou Yang, Arizona State University. 3:30-4:45 pm ET, Thursday, Feb. 8, 2024...</Summary>
  <Website>https://www.tejasgokhale.com/seminar.html</Website>
  <TrackingUrl>https://dev.my.umbc.edu/api/v0/pixel/news/138553/guest@my.umbc.edu/78889d31153122f68f1a32b354be22a0/api/pixel</TrackingUrl>
  <Tag>ai</Tag>
  <Tag>concepts</Tag>
  <Tag>vision</Tag>
  <Group token="umbc-ai">UMBC AI</Group>
  <GroupUrl>https://dev.my.umbc.edu/groups/umbc-ai</GroupUrl>
  <AvatarUrl>https://assets4-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
  <AvatarUrl size="original">https://assets2-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/original.png?1691095779</AvatarUrl>
  <AvatarUrl size="xxlarge">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxlarge.png?1691095779</AvatarUrl>
  <AvatarUrl size="xlarge">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xlarge.png?1691095779</AvatarUrl>
  <AvatarUrl size="large">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/large.png?1691095779</AvatarUrl>
  <AvatarUrl size="medium">https://assets3-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/medium.png?1691095779</AvatarUrl>
  <AvatarUrl size="small">https://assets3-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/small.png?1691095779</AvatarUrl>
  <AvatarUrl size="xsmall">https://assets4-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xsmall.png?1691095779</AvatarUrl>
  <AvatarUrl size="xxsmall">https://assets1-dev.my.umbc.edu/system/shared/avatars/groups/000/002/081/cfb27ebe008c2636486089a759ea5c36/xxsmall.png?1691095779</AvatarUrl>
  <Sponsor>Advances in Perception, Prediction, and Reasoning Lab</Sponsor>
  <ThumbnailUrl size="xxlarge">https://assets2-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/xxlarge.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="xlarge">https://assets3-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/xlarge.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="large">https://assets1-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/large.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="medium">https://assets3-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/medium.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="small">https://assets4-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/small.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="xsmall">https://assets3-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/xsmall.jpg?1706728119</ThumbnailUrl>
  <ThumbnailUrl size="xxsmall">https://assets3-dev.my.umbc.edu/system/shared/thumbnails/news/000/138/553/a86785c3911819cf5e537c492b14aa74/xxsmall.jpg?1706728119</ThumbnailUrl>
  <PawCount>0</PawCount>
  <CommentCount>0</CommentCount>
  <CommentsAllowed>true</CommentsAllowed>
  <PostedAt>Wed, 31 Jan 2024 14:22:06 -0500</PostedAt>
  <EditAt>Tue, 27 Feb 2024 17:40:25 -0500</EditAt>
</NewsItem>
</News>
