[{"id":977,"date":"2021-06-23T06:44:39","date_gmt":"2021-06-23T10:44:39","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=977"},"modified":"2021-06-23T06:44:39","modified_gmt":"2021-06-23T10:44:39","slug":"speakers","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/speakers\/","title":{"rendered":"Speakers"},"content":{"rendered":"<h2>Keynote Speakers<\/h2>\n<h3>Dr. Emre Neftci<br \/>\n<img loading=\"lazy\" decoding=\"async\" class=\"size-full wp-image-979 alignnone\" src=\"https:\/\/iconsconf.wpenginepowered.com\/2021\/wp-content\/uploads\/2021\/06\/thanks_eneftci.jpg\" alt=\"\" width=\"155\" height=\"175\" \/><\/h3>\n<p>Dr. Emre Neftci received his M.Sc. degree in physics from EPFL in Switzerland and his Ph.D. in 2010 from the Institute of Neuroinformatics at the University of Zurich and ETH Zurich. He is an associate professor in the Department of Cognitive Sciences and Computer Science at the University of California, Irvine, and, since July 2021, an institute director at the J\u00fclich Research Centre. His current research explores the bridges between neuroscience and machine learning, with a focus on the theoretical and computational modeling of learning algorithms that are best suited to neuromorphic hardware and non-von Neumann computing architectures.<\/p>\n<p>&nbsp;<\/p>\n<h3>Dr. 
Julie Grollier<\/h3>\n<p><img loading=\"lazy\" decoding=\"async\" class=\" wp-image-984\" src=\"https:\/\/iconsconf.wpenginepowered.com\/2021\/wp-content\/uploads\/2021\/06\/HD_007_210420__8508140-small-2-300x240.jpg\" alt=\"\" width=\"228\" height=\"182\" srcset=\"https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-300x240.jpg 300w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-1024x819.jpg 1024w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-768x614.jpg 768w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-1536x1229.jpg 1536w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-2048x1638.jpg 2048w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-360x288.jpg 360w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/HD_007_210420__8508140-small-2-575x460.jpg 575w\" sizes=\"auto, (max-width: 228px) 100vw, 228px\" \/><\/p>\n<p><span lang=\"en-US\">Julie Grollier is a senior researcher in the CNRS\/Thales laboratory south of Paris, where she leads the team on nanodevices for neuromorphic computing. Her work is interdisciplinary, from the physics of spintronic and resistive switching materials to the development of learning algorithms for Artificial Intelligence. For more information:\u00a0<\/span><a id=\"LPlnk604191\" href=\"http:\/\/julie.grollier.free.fr\/\" target=\"_blank\" rel=\"noopener noreferrer\" data-auth=\"NotApplicable\" data-linkindex=\"0\"><span lang=\"en-US\">http:\/\/julie.grollier.free.fr\/<\/span><\/a><\/p>\n<p>&nbsp;<\/p>\n<h3>Dr. 
Yulia Sandamirskaya<\/h3>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone  wp-image-985\" src=\"https:\/\/iconsconf.wpenginepowered.com\/2021\/wp-content\/uploads\/2021\/06\/Yulia-Portrait-002-300x200.jpg\" alt=\"\" width=\"242\" height=\"161\" srcset=\"https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-300x200.jpg 300w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-1024x683.jpg 1024w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-768x512.jpg 768w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-1536x1024.jpg 1536w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-360x240.jpg 360w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002-690x460.jpg 690w, https:\/\/icons.ornl.gov\/2021\/wp-content\/uploads\/sites\/3\/2021\/06\/Yulia-Portrait-002.jpg 1600w\" sizes=\"auto, (max-width: 242px) 100vw, 242px\" \/><\/p>\n<p>Yulia Sandamirskaya leads the Application Research team of the Neuromorphic Computing Lab at Intel. Her team develops spiking neural network-based algorithms for neuromorphic hardware to demonstrate the potential of neuromorphic computing in real-world applications. Before joining Intel, Yulia led the \u201cNeuromorphic Cognitive Robots\u201d group at the Institute of Neuroinformatics at the University of Zurich and ETH Zurich. She chaired EUCog, the European Society for Artificial Cognitive Systems, and coordinated the EU project NEUROTECH, creating and supporting the neuromorphic computing technology community in Europe.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Keynote Speakers Dr. Emre Neftci Dr. Emre Neftci received his M.Sc. degree in physics from EPFL in Switzerland, and his Ph.D. 
in 2010 at the Institute of Neuroinformatics at the University of Zurich and ETH Zurich. He is an associate professor in the Department of Cognitive Sciences and Computer Science at the University of California, [&hellip;]<\/p>\n","protected":false},"author":19,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-977","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/977","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/19"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=977"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/977\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=977"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":964,"date":"2021-06-21T09:30:11","date_gmt":"2021-06-21T13:30:11","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=964"},"modified":"2021-08-23T13:00:42","modified_gmt":"2021-08-23T17:00:42","slug":"schedule","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/schedule\/","title":{"rendered":"Schedule"},"content":{"rendered":"<h2><strong>\u00a0ICONS 2021 Virtual Conference Schedule<\/strong><\/h2>\n<p>All times are given in Eastern Daylight Time (EDT\/GMT-4).<\/p>\n<h3><strong>Tuesday, July 27, 2021<\/strong><\/h3>\n<p><strong>8:00 AM &#8211; 10:00 AM: Tutorial<\/strong><\/p>\n<ul>\n<li>&#8220;<a href=\"https:\/\/youtu.be\/O2-mT291ygg\">An Introduction to Deep Learning with Spiking Neural 
Networks using snnTorch<\/a>,&#8221; Jason Eshraghian<\/li>\n<\/ul>\n<p><strong>10:00 AM &#8211; 10:15 AM: Welcome<\/strong><\/p>\n<ul>\n<li>Tom Potok (General Chair), Melika Payvand and Katie Schuman (Program Co-Chairs)<\/li>\n<\/ul>\n<p><strong>10:15 AM &#8211; 11:15 AM: <a href=\"https:\/\/youtu.be\/B_6Okc1NoSY\">Keynote: Emre Neftci<\/a><\/strong><\/p>\n<p><strong>11:15 AM &#8211; 11:30 AM: Break<\/strong><\/p>\n<p><strong>11:30 AM &#8211; 1:15 PM: Full Talks on Biologically-Plausible Algorithms<\/strong><\/p>\n<ul>\n<li>&#8220;<span data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Towards Biologically-Plausible Neuron Models and Firing Rates in High-Performance Deep Spiking Neural Networks&quot;}\" data-sheets-userformat=\"{&quot;2&quot;:513,&quot;3&quot;:{&quot;1&quot;:0},&quot;12&quot;:0}\"><a href=\"https:\/\/vimeo.com\/581170748\">Towards Biologically-Plausible Neuron Models and Firing Rates in High-Performance Deep Spiking Neural Networks<\/a>,&#8221; Chen Li and Steve Furber<\/span><\/li>\n<li>&#8220;<span data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Efficient Biologically-Plausible Training of Spiking Neural Networks with Precise Timing&quot;}\" data-sheets-userformat=\"{&quot;2&quot;:513,&quot;3&quot;:{&quot;1&quot;:0},&quot;12&quot;:0}\"><a href=\"https:\/\/vimeo.com\/581170868\">Efficient Biologically-Plausible Training of Spiking Neural Networks with Precise Timing<\/a>,&#8221; Richard Boone, Wenrui Zhang and Peng Li<\/span><\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581171232\">Tightening the Biological Constraints on Gradient-Based Predictive Coding<\/a>,&#8221; Nicholas Alonso and Emre Neftci &#8212; <strong>BEST STUDENT PAPER AWARD<\/strong><\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581171165\">A spiking network model for semantic representation and replay-based association acquisition<\/a>,&#8221; Brian S. Robinson, Adam C. Polevoy, Sean L. McDaniel, Will Coon, Clara A. Scholl, Mark McLean and Erik C. 
Johnson<\/li>\n<\/ul>\n<p><strong>1:15 PM &#8211; 1:45 PM: Break<\/strong><\/p>\n<p><strong>1:45 PM &#8211; 3:30 PM: Lightning Talks on Algorithms<\/strong><\/p>\n<ul>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581174606\">Low Power Hardware-In-The-Loop Neuromorphic Training Accelerator<\/a>,&#8221; J. Parker\u00a0Mitchell and Catherine Schuman<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581175331\">Neuromorphic Design Using Reward-based STDP Learning on Event-Based Reconfigurable Cluster Architecture<\/a>,&#8221; Mahyar Shahsavari, David Thomas, Andrew Brown and Wayne Luk<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581174938\">Training Spiking Neural Networks with Synaptic Plasticity under Integer Representation<\/a>,&#8221; Shruti Kulkarni, Maryam Parsa, Parker Mitchell and Catherine Schuman<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581175517\">Temporal learning with biologically fitted SNN models<\/a>,&#8221; Yuan Zeng, Terrence C. Stewart, Zubayer Ibne Ferdous, Yevgeny Berdichevsky and Xiaochen Guo<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581175227\">Computational Complexity of Neuromorphic Algorithms<\/a>,&#8221; Prasanna Date, Bill Kay, Catherine Schuman, Robert Patton and Thomas Potok<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581175490\">Neko: a Library for Exploring Neuromorphic Learning Rules<\/a>,&#8221; Zixuan Zhao, Nathan Wycoff, Neil Getty, Rick Stevens and Fangfang Xia<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581175406\">NeuroXplorer 1.0: An Extensible Framework for Architectural Exploration with Spiking Neural Networks<\/a>,&#8221; Adarsha Balaji, Shihao Song, Twisha Titirsha, Anup Das, Jeffrey Krichmar, Nikil Dutt, James Shackleford, Nagarajan Kandasamy and Francky Catthoor<\/li>\n<\/ul>\n<p><strong>3:30 PM &#8211; 5:00 PM: <a href=\"https:\/\/youtu.be\/KbvWFjTcnuI\">Invited Talks<\/a><\/strong><\/p>\n<ul>\n<li><span style=\"font-weight: 400\">Hal Greenwald, AFOSR<\/span><\/li>\n<li><span style=\"font-weight: 
400\">Grace Hwang, NSF<\/span><\/li>\n<li><span style=\"font-weight: 400\">Robinson Pino, DOE ASCR<\/span><\/li>\n<li><span style=\"font-weight: 400\">Ken Whang, NSF<\/span><\/li>\n<\/ul>\n<h3><strong>Wednesday, July 28, 2021<\/strong><\/h3>\n<p><strong>10:00 AM &#8211; 10:15 AM: Welcome<\/strong><\/p>\n<ul>\n<li>Melika Payvand and Katie Schuman (Program Co-Chairs)<\/li>\n<\/ul>\n<p><strong>10:15 AM &#8211; 11:15 AM: <a href=\"https:\/\/youtu.be\/EZNv9aMMPsU\">Keynote: Julie Grollier<\/a><\/strong><\/p>\n<p><strong>11:15 AM &#8211; 11:30 AM: Break<\/strong><\/p>\n<p><strong>11:30 AM &#8211; 1:15 PM: Full Talks on Algorithms for Hardware<\/strong><\/p>\n<ul>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581179883\">Connection Pruning for Deep Spiking Neural Networks with On-Chip Learning<\/a>,&#8221; Thao N.N. Nguyen, Bharadwaj Veeravalli and Xuanyao Fong<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581180871\">Hessian Aware Quantization of Spiking Neural Networks<\/a>,&#8221; Hin Wai Lui and Emre Neftci<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581180763\">Porting Deep Spiking Q-Networks to neuromorphic chip Loihi<\/a>,&#8221; Mahmoud Akl, Yulia Sandamirskaya, Florian Walter and Alois Knoll<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/581180338\">Accurate and Accelerated Neuromorphic Network Design Leveraging A Bayesian Hyperparameter Pareto Optimization Approach<\/a>,&#8221; Maryam Parsa, Catherine Schuman, Nitin Rathi, Amir Ziabari, Derek Rose, J. Parker Mitchell, J. 
Travis Johnston, Bill Kay, Steven Young and Kaushik Roy &#8212; <strong>BEST PAPER AWARD<\/strong><\/li>\n<\/ul>\n<p><strong>1:15 PM &#8211; 1:45 PM: Break<\/strong><\/p>\n<p><strong>1:45 PM &#8211; 3:30 PM: Lightning Talks on Hardware<\/strong><\/p>\n<ul>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582067425\">Bridge Networks<\/a>,&#8221; Wilkie Olin-Ammentorp and Maxim Bazhenov<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582067669\">MNIST classification using Neuromorphic Nanowire Networks<\/a>,&#8221; Ruomin Zhu, Alon Loeffler, Joel Hochstetter, Adrian Diaz-Alvarez, Tomonobu Nakayama, Adam Stieg, James Gimzewski, Joseph Lizier and Zdenka Kuncic<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582067738\">Energy Efficient Neuromorphic Computing with beyond-CMOS Oscillatory Neural Networks<\/a>,&#8221; Corentin Delacour, Stefania Carapezzi, Gabriele Boschetto and Aida Todri-Sanial<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582067991\">BEOL compatible cross-bar array of ferroelectric synapses<\/a>,&#8221; Zhenming Yu, Laura B\u00e9gon-Lours, Yigit Demirag and Bert Jan Offrein<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582068282\">Reservoir Computing Using Networks of CMOS Logic Gates<\/a>,&#8221; Heidi Komkov, Liam Pocher, Alessandro Restelli, Brian Hunt and Daniel Lathrop<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582068351\">The Need for Scalable Digital Spiking Neurons<\/a>,&#8221; Alexander Jones and Krishnamurthy Vemuru<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582068425\">A Flexible FPGA Implementation of Morris-Lecar Neuron for Reproducing Different Neuronal Behaviors<\/a>,&#8221; Idir Mellal, David Crompton, Milos Popovic and Milad Lankarany<\/li>\n<\/ul>\n<p><strong>3:30 PM &#8211; 5:00 PM: Poster Session<\/strong><\/p>\n<ul>\n<li>&#8220;Neuromorphic System using LSI Neurons and MOSFET Synapses with Autonomous Learning Rule,&#8221; Mutsumi Kimura, Yoshinori Miyamae, Mitsuo Tamura and Yasuhiko Nakashima<\/li>\n<li>&#8220;Purely 
Spintronic Leaky Integrate-and-Fire Neurons,&#8221; Wesley Brigner, Naimul Hassan, Xuan Hu, Christopher Bennett, Felipe Garcia-Sanchez, Matthew Marinella, Jean Anne Incorvia and Joseph S. Friedman<\/li>\n<li>&#8220;Instruction Set for a Neuromorphic Co-Processor with On-Chip Learning,&#8221; Thao N.N. Nguyen, Bharadwaj Veeravalli and Xuanyao Fong<\/li>\n<li>&#8220;Linking Sparse Coding Dictionaries for Representation Learning,&#8221; Nicki Barari and Edward Kim<\/li>\n<li>&#8220;Integrate and Fire Neurons Based on Diffusive Memristors for Spiking Neural Networks,&#8221; Solomon Amsalu Chekol &#8212; <strong>BEST POSTER AWARD<\/strong><\/li>\n<li>&#8220;A Novel Facial Emotion Recognition system using In-Memory Computing,&#8221; Ajay BS<\/li>\n<\/ul>\n<h3><strong>Thursday, July 29, 2021<\/strong><\/h3>\n<p><strong>9:00 AM &#8211; 10:00 AM: Late-Breaking Results Session: <\/strong>Email your interest in participating to Katie Schuman at schumancd[at]ornl.gov!<\/p>\n<ul>\n<li>Yigit Demirag, ETH Zurich<\/li>\n<li>Samiran Ganguly, University of Virginia<\/li>\n<li>David Mascarenas, Los Alamos National Laboratory<\/li>\n<li>Andrew Sornborger, Los Alamos National Laboratory<\/li>\n<li>Anurag Daram, University of Texas San Antonio<\/li>\n<\/ul>\n<p><strong>10:00 AM &#8211; 10:15 AM: Welcome and Best Paper\/Poster Announcements<\/strong><\/p>\n<ul>\n<li>Melika Payvand and Katie Schuman (Program Co-Chairs)<\/li>\n<\/ul>\n<p><strong>10:15 AM &#8211; 11:15 AM: <a href=\"https:\/\/youtu.be\/HNxWqO3z1iI\">Keynote: Yulia Sandamirskaya<\/a><\/strong><\/p>\n<p><strong>11:15 AM &#8211; 11:30 AM: Break<\/strong><\/p>\n<p><strong>11:30 AM &#8211; 1:15 PM: Full Talks on Applications<\/strong><\/p>\n<ul>\n<li>&#8220;<span data-sheets-value=\"{&quot;1&quot;:2,&quot;2&quot;:&quot;Superevents: Towards Native Semantic Segmentation for Event-based Cameras\\t&quot;}\" data-sheets-userformat=\"{&quot;2&quot;:513,&quot;3&quot;:{&quot;1&quot;:0},&quot;12&quot;:0}\"><a 
href=\"https:\/\/vimeo.com\/582076529\">Superevents: Towards Native Semantic Segmentation for Event-based Cameras<\/a>,&#8221; Weng Fei Low, Ankit Sonthalia, Zhi Gao, Andr\u00e9 van Schaik and Bharath Ramesh<\/span><\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582077027\">Dynamic Vision Sensor integration on FPGA-based CNN accelerators for high-speed visual classification<\/a>,&#8221; Alejandro Linares-Barranco, Antonio Rios-Navarro, Ricardo Tapiador Morales, Salvador Canas Moreno, Enrique Pi\u00f1ero Fuentes and Tobi Delbruck<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582077304\">Spiking Neuromorphic Networks for Binary Tasks<\/a>,&#8221; James S. Plank, Chaohui Zheng, Catherine Schuman and Christopher Dean<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582078035\">A System for Validating Resistive Neural Network Prototypes<\/a>,&#8221; Brian Hoskins, Wen Ma, Mitchell Fream, Osama Yousuf, Mathew Daniels, Jonathan Goodwill, Advait Madhavan, Hoang Tung, Mark Branstad, Muqing Liu, Rasmus Madsen, Jabez Mclelland, Gina Adam and Martin Lueker-Boden<\/li>\n<\/ul>\n<p><strong>1:15 PM &#8211; 1:45 PM: Break<\/strong><\/p>\n<p><strong>1:45 PM &#8211; 3:30 PM: Lightning Talks on Applications<\/strong><\/p>\n<ul>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582073574\">Neuromorphic Computing for Autonomous Racing<\/a>,&#8221; Robert Patton, Catherine Schuman, Shruti R. Kulkarni, Maryam Parsa, John Mitchell, N. 
Quentin Haas, Christopher Stahl, Spencer Paulissen, Prasanna Date, Thomas Potok and Shay Snyder<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582073661\">Drone Virtual Fence Using a Neuromorphic Camera<\/a>,&#8221; Terrence Stewart, Marc-Antoine Drouin, Guillaume Gagn\u00e9 and Guy Godin<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582073763\">Neuromorphic Graph Algorithms: Cycle Detection, Odd Cycle Detection, and Max Flow<\/a>,&#8221; Bill Kay, Catherine Schuman, Jade O&#8217;Connor, Prasanna Date and Thomas Potok<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582073913\">Signals to Spikes for Neuromorphic Regulated Reservoir Computing and EMG Hand Gesture Recognition<\/a>,&#8221; Nikhil Garg, Ismael Balafrej, Yann Beilliard, Dominique Drouin, Fabien Alibart and Jean Rouat<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582074705\">Implementation of Dragonfly Interception Model on Neuromorphic Hardware<\/a>,&#8221; Luke Parker, Frances Chance and Suma Cardwell<\/li>\n<li>&#8220;<a href=\"https:\/\/vimeo.com\/582075064\">2D Histogram based Region Proposal (H2RP) Algorithm for Neuromorphic Vision Sensors<\/a>,&#8221; Ranajay Medya, Sai Sukruth Bezugam, Dwijay Bane and Manan Suri<\/li>\n<\/ul>\n<p><strong>3:30 PM &#8211; 5:00 PM: Doctoral Consortium\u00a0<\/strong><\/p>\n<ul>\n<li>Sathwika Bavikadi, George Mason University<\/li>\n<li>Wesley Brigner, University of Texas at Dallas<\/li>\n<li>Hagar Hendy, Rochester Institute of Technology<\/li>\n<li>Ziru Li, Duke University<\/li>\n<li>Sebastian Siegel, Forschungszentrum Juelich<\/li>\n<\/ul>\n<p>&nbsp;<\/p>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>\u00a0ICONS 2021 Virtual Conference Schedule All times are given in Eastern Daylight Time (EDT\/GMT-4). 
Tuesday, July 27, 2021 8:00 AM &#8211; 10:00 AM: Tutorial &#8220;An Introduction to Deep Learning with Spiking Neural Networks using snnTorch,&#8221; Jason Eshraghian 10:00 AM &#8211; 10:15 AM: Welcome Tom Potok (General Chair), Melika Payvand and Katie Schuman (Program Co-Chairs) 10:15 [&hellip;]<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-964","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/964","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=964"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/964\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=964"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":953,"date":"2021-06-09T14:39:07","date_gmt":"2021-06-09T18:39:07","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=953"},"modified":"2021-07-02T07:22:54","modified_gmt":"2021-07-02T11:22:54","slug":"students-icons","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/students-icons\/","title":{"rendered":"Students @ ICONS"},"content":{"rendered":"<p><strong>Virtual Doctoral Consortium<\/strong><\/p>\n<p>For the second year, ICONS 2021 will host a virtual doctoral consortium. 
PhD students who are currently working on their dissertation research and would like feedback from the ICONS community are eligible to participate. Doctoral consortium participants will provide 10-minute videos describing their dissertation work ahead of ICONS 2021. During the virtual meeting, doctoral consortium participants will have the opportunity to interact with conference participants, receive feedback about their research, and receive information about job opportunities, internships, and career development.<\/p>\n<p><strong>Submit Your Entry<\/strong>: Submit to participate in the virtual doctoral consortium: <a href=\"https:\/\/forms.gle\/iFTvHViPZA7wQijp6\">https:\/\/forms.gle\/iFTvHViPZA7wQijp6<\/a><\/p>\n<p><strong>Submission Deadline<\/strong>: July 9, 2021 AoE<\/p>\n<p>&nbsp;<\/p>\n<p><strong>Virtual Student Poster Competition<\/strong><\/p>\n<p>This year\u2019s poster competition will be virtual, with each student submitting a 3-minute video describing their poster. Undergraduate and graduate students are eligible to compete. Students can choose a typical \u201cposter\u201d format or present their work in a more unconventional way in the video! We will have a poster \u201cnetworking\u201d session during the conference dates, July 27-29, during which conference attendees can interact in real time with poster presenters.<\/p>\n<p><strong>Submit Your Entry<\/strong>: Submit to participate in the virtual student poster competition: <a href=\"https:\/\/forms.gle\/9de4rMUj4rujfcRj9\">https:\/\/forms.gle\/9de4rMUj4rujfcRj9<\/a><\/p>\n<p><strong>Submission Deadline<\/strong>: July 9, 2021 AoE<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Virtual Doctoral Consortium For the second year, ICONS 2021 will host a virtual doctoral consortium. 
PhD students who are currently working on their dissertation research and would like feedback from the ICONS community are eligible to participate. Doctoral consortium participants will provide 10 minute videos describing their dissertation work ahead of ICONS 2021. During the [&hellip;]<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-953","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/953","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=953"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/953\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=953"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":950,"date":"2021-06-09T14:38:51","date_gmt":"2021-06-09T18:38:51","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=950"},"modified":"2021-07-29T14:50:45","modified_gmt":"2021-07-29T18:50:45","slug":"committees","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/committees\/","title":{"rendered":"Committees"},"content":{"rendered":"<h2>Organizing Committee<\/h2>\n<p><strong>General Chair:<\/strong> Tom Potok, Oak Ridge National Laboratory<\/p>\n<p><strong>Program Co-Chairs:<\/strong> Melika Payvand, ETH Zurich, and Catherine Schuman, Oak Ridge National Laboratory<\/p>\n<p><strong>Technical Organization Point of Contact and 
Webmaster:<\/strong> Prasanna Date, Oak Ridge National Laboratory<\/p>\n<p><strong>Asia Point of Contact:<\/strong> Mutsumi Kimura, Nara Institute of Science and Technology (NAIST)<\/p>\n<p><strong>Doctoral Consortium Organizer:<\/strong> Cory Merkel, Rochester Institute of Technology<\/p>\n<h2>Program Committee<\/h2>\n<p><strong>Program Co-Chairs:<\/strong><\/p>\n<p>Melika Payvand, ETH Zurich<\/p>\n<p>Catherine Schuman, Oak Ridge National Laboratory<\/p>\n<p><strong>Committee:<\/strong><\/p>\n<p>Brad Aimone, Sandia National Laboratories<\/p>\n<p>Ahmedullah Aziz, University of Tennessee, Knoxville<\/p>\n<p>Prasanna Balaprakash, Argonne National Laboratory<\/p>\n<p>Simeon Bamford, IIT<\/p>\n<p>Arindam Basu, Nanyang Technological University<\/p>\n<p>Irem Boybat, IBM Research, Zurich<\/p>\n<p>Federico Corradi, imec<\/p>\n<p>Kelvin Fong, National University of Singapore<\/p>\n<p>Yoshihiko Horio, Tohoku University<\/p>\n<p>David Kappel, University of G\u00f6ttingen<\/p>\n<p>Edward Kim, Drexel University<\/p>\n<p>Kyung Min Kim, KAIST<\/p>\n<p>Cory Merkel, Rochester Institute of Technology<\/p>\n<p>Alice Mizrahi, Thales<\/p>\n<p>Yasuhiko Nakashima, NAIST<\/p>\n<p>Alice Parker, University of Southern California<\/p>\n<p>Ivan Schuller, University of California San Diego<\/p>\n<p>William Severa, Sandia National Laboratories<\/p>\n<p>Hakaru Tamukoh, Kyushu Institute of Technology<\/p>\n<p>Andreas Wild, Intel<\/p>\n<h2>Best Paper Committee<\/h2>\n<p>Melika Payvand, ETH Zurich<\/p>\n<p>Brad Aimone, Sandia National Laboratories<\/p>\n<p>Kelvin Fong, National University of Singapore<\/p>\n<h2>Student Poster Competition Committee<\/h2>\n<p>Janette Briones, NASA Glenn Research Center<\/p>\n<p>Frances Chance, Sandia National Laboratories<\/p>\n<p>Joe Hays, US Naval Research Lab<\/p>\n<p>Brian Hoskins, NIST<\/p>\n<p>Shruti Kulkarni, Oak Ridge National Laboratory<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Organizing Committee General Chair: Tom Potok, Oak Ridge National 
Laboratory Program Co-Chairs: Melika Payvand, ETH Zurich, and Catherine Schuman, Oak Ridge National Laboratory Technical Organization Point of Contact and Webmaster: Prasanna Date, Oak Ridge National Laboratory Asia Point of Contact: Mutsumi Kimura, Nara Institute of Science and Technology (NAIST) Doctoral Consortium Organizer: Cory Merkel, Rochester [&hellip;]<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-950","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/950","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=950"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/950\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=950"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":945,"date":"2021-06-07T15:29:35","date_gmt":"2021-06-07T19:29:35","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=945"},"modified":"2021-07-21T10:00:58","modified_gmt":"2021-07-21T14:00:58","slug":"registration","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/registration\/","title":{"rendered":"Registration"},"content":{"rendered":"<p>ICONS 2021 will be held as a virtual conference. 
Please use <a href=\"https:\/\/utconferences.eventsair.com\/international-conference-on-neuromorphic-systems-icons-2021\/registration\/Site\/Register\">this link<\/a> to register for ICONS 2021. The registration deadline is July 26, 2021.<\/p>\n<p>The registration rates are:<\/p>\n<p>General registration: USD 100.00<br \/>\nStudent registration: USD 20.00<\/p>\n<p>We look forward to meeting you virtually in July 2021!<\/p>\n","protected":false},"excerpt":{"rendered":"<p>ICONS 2021 will be held as a virtual conference. Please use this link to register for ICONS 2021. The registration deadline is July 26, 2021. The registration rates are: General registration: USD 100.00 Student registration: USD 20.00 We look forward to meeting you virtually in July 2021!<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-945","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/945","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=945"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/945\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=945"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":929,"date":"2021-03-15T23:02:36","date_gmt":"2021-03-16T03:02:36","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=929"},"modified":"2022-03-17T12:47:56","modified_gmt":"2
022-03-17T16:47:56","slug":"previous-editions","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/previous-editions\/","title":{"rendered":"Previous Editions"},"content":{"rendered":"<ul>\n<li><a href=\"https:\/\/iconsconf.wpenginepowered.com\/2020\">ICONS 2020<\/a><\/li>\n<li><a href=\"https:\/\/ornlcda.github.io\/icons2019\/index.html\">ICONS 2019<\/a><\/li>\n<li><a href=\"https:\/\/ornlcda.github.io\/icons2018\/\">ICONS 2018<\/a><\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>ICONS 2020 ICONS 2019 ICONS 2018<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":3,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-929","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/929","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=929"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/929\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=929"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":915,"date":"2021-02-04T14:01:09","date_gmt":"2021-02-04T19:01:09","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=915"},"modified":"2021-03-15T23:12:19","modified_gmt":"2021-03-16T03:12:19","slug":"subscribe","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/subscribe\/","title":{"rendered":"Subscribe"},"content":{"rendered":"<p>Follow us on <a 
href=\"https:\/\/twitter.com\/icons_neuro\">Twitter @icons_neuro<\/a>!<\/p>\n<p>&nbsp;<\/p>\n<script type=\"text\/javascript\">\n\/* <![CDATA[ *\/\nvar gform;gform||(document.addEventListener(\"gform_main_scripts_loaded\",function(){gform.scriptsLoaded=!0}),document.addEventListener(\"gform\/theme\/scripts_loaded\",function(){gform.themeScriptsLoaded=!0}),window.addEventListener(\"DOMContentLoaded\",function(){gform.domLoaded=!0}),gform={domLoaded:!1,scriptsLoaded:!1,themeScriptsLoaded:!1,isFormEditor:()=>\"function\"==typeof InitializeEditor,callIfLoaded:function(o){return!(!gform.domLoaded||!gform.scriptsLoaded||!gform.themeScriptsLoaded&&!gform.isFormEditor()||(gform.isFormEditor()&&console.warn(\"The use of gform.initializeOnLoaded() is deprecated in the form editor context and will be removed in Gravity Forms 3.1.\"),o(),0))},initializeOnLoaded:function(o){gform.callIfLoaded(o)||(document.addEventListener(\"gform_main_scripts_loaded\",()=>{gform.scriptsLoaded=!0,gform.callIfLoaded(o)}),document.addEventListener(\"gform\/theme\/scripts_loaded\",()=>{gform.themeScriptsLoaded=!0,gform.callIfLoaded(o)}),window.addEventListener(\"DOMContentLoaded\",()=>{gform.domLoaded=!0,gform.callIfLoaded(o)}))},hooks:{action:{},filter:{}},addAction:function(o,r,e,t){gform.addHook(\"action\",o,r,e,t)},addFilter:function(o,r,e,t){gform.addHook(\"filter\",o,r,e,t)},doAction:function(o){gform.doHook(\"action\",o,arguments)},applyFilters:function(o){return gform.doHook(\"filter\",o,arguments)},removeAction:function(o,r){gform.removeHook(\"action\",o,r)},removeFilter:function(o,r,e){gform.removeHook(\"filter\",o,r,e)},addHook:function(o,r,e,t,n){null==gform.hooks[o][r]&&(gform.hooks[o][r]=[]);var d=gform.hooks[o][r];null==n&&(n=r+\"_\"+d.length),gform.hooks[o][r].push({tag:n,callable:e,priority:t=null==t?10:t})},doHook:function(r,o,e){var t;if(e=Array.prototype.slice.call(e,1),null!=gform.hooks[r][o]&&((o=gform.hooks[r][o]).sort(function(o,r){return 
o.priority-r.priority}),o.forEach(function(o){\"function\"!=typeof(t=o.callable)&&(t=window[t]),\"action\"==r?t.apply(null,e):e[0]=t.apply(null,e)})),\"filter\"==r)return e[0]},removeHook:function(o,r,t,n){var e;null!=gform.hooks[o][r]&&(e=(e=gform.hooks[o][r]).filter(function(o,r,e){return!!(null!=n&&n!=o.tag||null!=t&&t!=o.priority)}),gform.hooks[o][r]=e)}});\n\/* ]]> *\/\n<\/script>\n\n                <div class='gf_browser_gecko gform_wrapper gform_legacy_markup_wrapper gform-theme--no-framework' data-form-theme='legacy' data-form-index='0' id='gform_wrapper_5' >\n                        <div class='gform_heading'>\n                            <p class='gform_description'>Get all the latest updates about ICONS 2021!<\/p>\n                        <\/div><form method='post' enctype='multipart\/form-data'  id='gform_5'  action='\/2021\/wp-json\/wp\/v2\/pages' data-formid='5' novalidate>\n                        <div class='gform-body gform_body'><ul id='gform_fields_5' class='gform_fields top_label form_sublabel_below description_below validation_below'><li id=\"field_5_4\" class=\"gfield gfield--type-name gfield_contains_required field_sublabel_below gfield--no-description field_description_below field_validation_below gfield_visibility_visible\"  ><label class='gfield_label gform-field-label gfield_label_before_complex' >Name<span class=\"gfield_required\"><span class=\"gfield_required gfield_required_asterisk\">*<\/span><\/span><\/label><div class='ginput_complex ginput_container ginput_container--name no_prefix has_first_name no_middle_name has_last_name no_suffix gf_name_has_2 ginput_container_name gform-grid-row' id='input_5_4'>\n                            \n                            <span id='input_5_4_3_container' class='name_first gform-grid-col gform-grid-col--size-auto' >\n                                                    <input type='text' name='input_4.3' id='input_5_4_3' value=''   aria-required='true'     \/>\n                                   
                 <label for='input_5_4_3' class='gform-field-label gform-field-label--type-sub '>First<\/label>\n                                                <\/span>\n                            \n                            <span id='input_5_4_6_container' class='name_last gform-grid-col gform-grid-col--size-auto' >\n                                                    <input type='text' name='input_4.6' id='input_5_4_6' value=''   aria-required='true'     \/>\n                                                    <label for='input_5_4_6' class='gform-field-label gform-field-label--type-sub '>Last<\/label>\n                                                <\/span>\n                            \n                        <\/div><\/li><li id=\"field_5_6\" class=\"gfield gfield--type-text gfield_contains_required field_sublabel_below gfield--has-description field_description_below field_validation_below gfield_visibility_visible\"  ><label class='gfield_label gform-field-label' for='input_5_6'>Affiliation<span class=\"gfield_required\"><span class=\"gfield_required gfield_required_asterisk\">*<\/span><\/span><\/label><div class='ginput_container ginput_container_text'><input name='input_6' id='input_5_6' type='text' value='' class='medium'  aria-describedby=\"gfield_description_5_6\"   aria-required=\"true\" aria-invalid=\"false\"   \/><\/div><div class='gfield_description' id='gfield_description_5_6'>University or organization you are affiliated to, e.g. 
Massachusetts Institute of Technology, Intel Labs etc.<\/div><\/li><li id=\"field_5_7\" class=\"gfield gfield--type-text gfield_contains_required field_sublabel_below gfield--has-description field_description_below field_validation_below gfield_visibility_visible\"  ><label class='gfield_label gform-field-label' for='input_5_7'>Position<span class=\"gfield_required\"><span class=\"gfield_required gfield_required_asterisk\">*<\/span><\/span><\/label><div class='ginput_container ginput_container_text'><input name='input_7' id='input_5_7' type='text' value='' class='medium'  aria-describedby=\"gfield_description_5_7\"   aria-required=\"true\" aria-invalid=\"false\"   \/><\/div><div class='gfield_description' id='gfield_description_5_7'>Your position at your affiliated university or organization, e.g. Student, Professor, Scientist etc.<\/div><\/li><li id=\"field_5_5\" class=\"gfield gfield--type-email gfield_contains_required field_sublabel_below gfield--no-description field_description_below field_validation_below gfield_visibility_visible\"  ><label class='gfield_label gform-field-label' for='input_5_5'>Email<span class=\"gfield_required\"><span class=\"gfield_required gfield_required_asterisk\">*<\/span><\/span><\/label><div class='ginput_container ginput_container_email'>\n                            <input name='input_5' id='input_5_5' type='email' value='' class='medium'    aria-required=\"true\" aria-invalid=\"false\"  \/>\n                        <\/div><\/li><\/ul><\/div>\n        <div class='gform-footer gform_footer top_label'> <input type='submit' id='gform_submit_button_5' class='gform_button button' onclick='gform.submission.handleButtonClick(this);' data-submission-type='submit' value='Subscribe'  \/> \n            <input type='hidden' class='gform_hidden' name='gform_submission_method' data-js='gform_submission_method_5' value='postback' \/>\n            <input type='hidden' class='gform_hidden' name='gform_theme' data-js='gform_theme_5' 
id='gform_theme_5' value='legacy' \/>\n            <input type='hidden' class='gform_hidden' name='gform_style_settings' data-js='gform_style_settings_5' id='gform_style_settings_5' value='[]' \/>\n            <input type='hidden' class='gform_hidden' name='is_submit_5' value='1' \/>\n            <input type='hidden' class='gform_hidden' name='gform_submit' value='5' \/>\n            \n            <input type='hidden' class='gform_hidden' name='gform_currency' data-currency='USD' value='xzBkKMzv2e9vzF25lFs4gBFa+RIwKcAq\/I19CI84L9OcBGfnkDoTC4p5di\/xy3XP4m2VWo7TfqkB8Jfh47eRWx3VepVJzq\/N7c2aEoKwBwpz8Dw=' \/>\n            <input type='hidden' class='gform_hidden' name='gform_unique_id' value='' \/>\n            <input type='hidden' class='gform_hidden' name='state_5' value='WyJbXSIsIjJlMzNkNmZjY2Y4N2VkZmU0YWY0ZTQ0ZTIyNWI2ZjM5Il0=' \/>\n            <input type='hidden' autocomplete='off' class='gform_hidden' name='gform_target_page_number_5' id='gform_target_page_number_5' value='0' \/>\n            <input type='hidden' autocomplete='off' class='gform_hidden' name='gform_source_page_number_5' id='gform_source_page_number_5' value='1' \/>\n            <input type='hidden' name='gform_field_values' value='' \/>\n            \n        <\/div>\n                        <\/form>\n                        <\/div><script type=\"text\/javascript\">\n\/* <![CDATA[ *\/\n gform.initializeOnLoaded( function() {gformInitSpinner( 5, 'http:\/\/icons.ornl.gov\/2021\/wp-content\/plugins\/gravityforms\/images\/spinner.svg', true );jQuery('#gform_ajax_frame_5').on('load',function(){var contents = jQuery(this).contents().find('*').html();var is_postback = contents.indexOf('GF_AJAX_POSTBACK') >= 0;if(!is_postback){return;}var form_content = jQuery(this).contents().find('#gform_wrapper_5');var is_confirmation = jQuery(this).contents().find('#gform_confirmation_wrapper_5').length > 0;var is_redirect = contents.indexOf('gformRedirect(){') >= 0;var is_form = form_content.length > 0 && ! 
is_redirect && ! is_confirmation;var mt = parseInt(jQuery('html').css('margin-top'), 10) + parseInt(jQuery('body').css('margin-top'), 10) + 100;if(is_form){jQuery('#gform_wrapper_5').html(form_content.html());if(form_content.hasClass('gform_validation_error')){jQuery('#gform_wrapper_5').addClass('gform_validation_error');} else {jQuery('#gform_wrapper_5').removeClass('gform_validation_error');}setTimeout( function() { \/* delay the scroll by 50 milliseconds to fix a bug in chrome *\/  }, 50 );if(window['gformInitDatepicker']) {gformInitDatepicker();}if(window['gformInitPriceFields']) {gformInitPriceFields();}var current_page = jQuery('#gform_source_page_number_5').val();gformInitSpinner( 5, 'http:\/\/icons.ornl.gov\/2021\/wp-content\/plugins\/gravityforms\/images\/spinner.svg', true );jQuery(document).trigger('gform_page_loaded', [5, current_page]);window['gf_submitting_5'] = false;}else if(!is_redirect){var confirmation_content = jQuery(this).contents().find('.GF_AJAX_POSTBACK').html();if(!confirmation_content){confirmation_content = contents;}jQuery('#gform_wrapper_5').replaceWith(confirmation_content);jQuery(document).trigger('gform_confirmation_loaded', [5]);window['gf_submitting_5'] = false;wp.a11y.speak(jQuery('#gform_confirmation_message_5').text());}else{jQuery('#gform_5').append(contents);if(window['gformRedirect']) {gformRedirect();}}jQuery(document).trigger(\"gform_pre_post_render\", [{ formId: \"5\", currentPage: \"current_page\", abort: function() { this.preventDefault(); } }]);        if (event && event.defaultPrevented) {                return;        }        const gformWrapperDiv = document.getElementById( \"gform_wrapper_5\" );        if ( gformWrapperDiv ) {            const visibilitySpan = document.createElement( \"span\" );            visibilitySpan.id = \"gform_visibility_test_5\";            gformWrapperDiv.insertAdjacentElement( \"afterend\", visibilitySpan );        }        const visibilityTestDiv = document.getElementById( 
\"gform_visibility_test_5\" );        let postRenderFired = false;        function triggerPostRender() {            if ( postRenderFired ) {                return;            }            postRenderFired = true;            gform.core.triggerPostRenderEvents( 5, current_page );            if ( visibilityTestDiv ) {                visibilityTestDiv.parentNode.removeChild( visibilityTestDiv );            }        }        function debounce( func, wait, immediate ) {            var timeout;            return function() {                var context = this, args = arguments;                var later = function() {                    timeout = null;                    if ( !immediate ) func.apply( context, args );                };                var callNow = immediate && !timeout;                clearTimeout( timeout );                timeout = setTimeout( later, wait );                if ( callNow ) func.apply( context, args );            };        }        const debouncedTriggerPostRender = debounce( function() {            triggerPostRender();        }, 200 );        if ( visibilityTestDiv && visibilityTestDiv.offsetParent === null ) {            const observer = new MutationObserver( ( mutations ) => {                mutations.forEach( ( mutation ) => {                    if ( mutation.type === 'attributes' && visibilityTestDiv.offsetParent !== null ) {                        debouncedTriggerPostRender();                        observer.disconnect();                    }                });            });            observer.observe( document.body, {                attributes: true,                childList: false,                subtree: true,                attributeFilter: [ 'style', 'class' ],            });        } else {            triggerPostRender();        }    } );} ); \n\/* ]]> *\/\n<\/script>\n\n","protected":false},"excerpt":{"rendered":"<p>Follow us on Twitter @icons_neuro! 
&nbsp;<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":2,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-915","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/915","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=915"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/915\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=915"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}},{"id":907,"date":"2021-02-04T13:52:44","date_gmt":"2021-02-04T18:52:44","guid":{"rendered":"https:\/\/iconsconf.wpenginepowered.com\/2021\/?page_id=907"},"modified":"2021-06-25T06:37:19","modified_gmt":"2021-06-25T10:37:19","slug":"for-authors","status":"publish","type":"page","link":"https:\/\/icons.ornl.gov\/2021\/for-authors\/","title":{"rendered":"For Authors"},"content":{"rendered":"<h2>Call for Papers<\/h2>\n<div class=\"page\" title=\"Page 1\">\n<div class=\"section\">\n<div class=\"layoutArea\">\n<div class=\"column\">\n<p>With the looming end of the \u201cMoore\u2019s Law\u201d era, there is an emerging challenge to \u201ccreate a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.\u201d<\/p>\n<p>Neuromorphic computing will play a major role in this challenge and has the potential to transform the way we use computers 
through new materials, new brain-inspired chips, greater understanding of neuroscience, and breakthroughs in machine understanding\/intelligence. Neuromorphic computing systems have the potential to mimic the functionality of neural systems in the brain, which we believe will lead to more powerful and efficient computing paradigms. The goal of this conference is to bring together leading researchers in neuromorphic computing to present new research, develop new collaborations, and provide a forum to publish work in this area.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>RESEARCH PAPERS ARE REQUESTED FOR TOPICS ON NEUROMORPHIC COMPUTING, SPECIFICALLY IN FOUR FOCUS AREAS:<\/strong><\/p>\n<ul>\n<li>Systems, architectures, and circuits\n<ul>\n<li>Network, neuron, and synapse models<\/li>\n<li>Non-von Neumann computing architectures and models<\/li>\n<li>Emerging devices and hardware implementations<\/li>\n<li>Event or spike-based systems<\/li>\n<li>Neuromorphic circuits<\/li>\n<\/ul>\n<\/li>\n<li>Machine intelligence algorithms for programming or training neuromorphic devices\n<ul>\n<li>Supervised and unsupervised learning methods<\/li>\n<li>Biologically-inspired algorithms<\/li>\n<li>Adaptations to existing algorithms for use on or with neuromorphic systems<\/li>\n<\/ul>\n<\/li>\n<li>Applications for and use-cases of neuromorphic systems\n<ul>\n<li>Applications where neuromorphic systems have the potential to outperform state-of-the-art techniques<\/li>\n<li>Suggestions for benchmark tasks for neuromorphic computing<\/li>\n<li>Neuromorphic datasets<\/li>\n<\/ul>\n<\/li>\n<li>Supporting software and systems for neuromorphic systems\n<ul>\n<li>Efficient simulation techniques for hardware and large-scale networks<\/li>\n<li>Compilers and programming frameworks<\/li>\n<li>Visualization tools<\/li>\n<\/ul>\n<\/li>\n<\/ul>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<div class=\"page\" title=\"Page 2\">\n<div class=\"section\">\n<div class=\"layoutArea\">\n<div 
class=\"column\">\n<p>&nbsp;<\/p>\n<p>Note: Submissions outside the scope of these areas, including from materials science and neuroscience, will also be considered (especially for lightning talks and posters), although they are not the focus of this conference.<\/p>\n<p>&nbsp;<\/p>\n<p><strong>WE ARE ACCEPTING SUBMISSIONS IN THE FOLLOWING FORMATS:<\/strong><\/p>\n<ul>\n<li>Full papers (6-8 pages), which will be considered for full (20 minute) presentations. Full papers should present original research and will be included in the conference proceedings.<\/li>\n<li>Short papers (3-4 pages), which will be considered for full presentations and\/or lightning talks. Short papers can be position papers or present preliminary results and will be included in the conference proceedings.<\/li>\n<li>Extended abstracts (1 page) for lightning talks and\/or poster presentations. Extended abstracts will not be included in the conference proceedings.<\/li>\n<li>Tutorial submissions (2-3 pages) for 1-2 hour tutorial sessions. Tutorials should include a hands-on component for tutorial attendees to work on or interact with neuromorphic software or hardware. Tutorials should be led by no more than three facilitators. Unlike paper and abstract submissions, tutorial submissions should be submitted via email to Katie Schuman at schumancd [at] ornl.gov.<\/li>\n<li>Special session submissions (2-3 pages) for 1-2 hour special sessions. Special sessions should include invited presentations on a specific topic. Special session submissions will not be included in the conference proceedings. Unlike paper and abstract submissions, special session submissions should be submitted via email to Katie Schuman at schumancd [at] ornl.gov.<\/li>\n<\/ul>\n<\/div>\n<\/div>\n<\/div>\n<\/div>\n<h2><\/h2>\n<p>&nbsp;<\/p>\n<h2>Submission Instructions<\/h2>\n<ol>\n<li>Format your paper according to the <a href=\"https:\/\/www.acm.org\/publications\/proceedings-template\">ACM SIGCONF<\/a> article template. 
Both LaTeX and Microsoft Word templates are available.<\/li>\n<li>Submit your paper using the <a href=\"https:\/\/easychair.org\/conferences\/?conf=icons2021\">EasyChair portal<\/a> for ICONS 2021.<\/li>\n<\/ol>\n<h2><\/h2>\n<h2><strong>Video Preparation Instructions<\/strong><\/h2>\n<p>All accepted papers will prepare a pre-recorded presentation for the ICONS virtual conference.\u00a0 There are several options for recording your video, including utilizing any meeting software (e.g., Zoom, Microsoft Teams, WebEx, Skype, etc.).\u00a0 You may also record a voice-over in PowerPoint and convert to mp4, or do a screen capture with a voice-over using tools such as QuickTime.\u00a0 To convert your video to the appropriate format or bitrate, we recommend <a href=\"https:\/\/ffmpeg.org\/\">ffmpeg<\/a>.<\/p>\n<ul>\n<li><strong><em>All files must be in MP4 Format<\/em><\/strong><\/li>\n<li>Less than or equal to 1 Mbps bitrate: To check the bitrate, right click on the file name, click on properties, go to the details tab, and look for total bitrate.<\/li>\n<li>Resolution = maximum 720p HD<\/li>\n<li>Please use the following\u00a0<strong>naming convention: FirstAuthorLastName_PaperID.mp4, where PaperID is your submission ID<\/strong>\u00a0from the paper submission system.<\/li>\n<\/ul>\n<p>Papers accepted as full presentations should prepare videos no longer than 20 minutes and papers accepted as lightning talks should prepare videos no longer than 10 minutes.<\/p>\n<p>Instructions for submitting the videos have been sent to the authors via email.\u00a0 Presentation video submission is due on July 7, 2021.<\/p>\n<h2>Important Dates<\/h2>\n<ul>\n<li><strong><del>April 15, 2021<\/del> <ins>April 30, 2021<\/ins><\/strong> &#8211; Paper Submission Deadline<\/li>\n<li><strong><del>May 31, 2021<\/del> <ins>June 7, 2021<\/ins><\/strong> &#8211; Notification of Acceptance<\/li>\n<li><strong>July 27-29, 2021<\/strong> &#8211; Virtual 
Conference!<\/li>\n<\/ul>\n","protected":false},"excerpt":{"rendered":"<p>Call for Papers With the looming end of the \u201cMoore\u2019s Law\u201d era, there is an emerging challenge to \u201ccreate a new type of computer that can proactively interpret and learn from data, solve unfamiliar problems using what it has learned, and operate with the energy efficiency of the human brain.\u201d Neuromorphic computing will play a [&hellip;]<\/p>\n","protected":false},"author":25,"featured_media":0,"parent":0,"menu_order":1,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-907","page","type-page","status-publish","hentry"],"_links":{"self":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/907","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/users\/25"}],"replies":[{"embeddable":true,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/comments?post=907"}],"version-history":[{"count":0,"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/pages\/907\/revisions"}],"wp:attachment":[{"href":"https:\/\/icons.ornl.gov\/2021\/wp-json\/wp\/v2\/media?parent=907"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}]