
FORCE learning

1 Seminar

Latest

Seminar · Neuroscience

The centrality of population-level factors to network computation is demonstrated by a versatile approach for training spiking networks

Brian DePasquale
Princeton
May 3, 2023

Neural activity is often described in terms of population-level factors extracted from the responses of many neurons. Factors provide a lower-dimensional description with the aim of shedding light on network computations. Yet, mechanistically, computations are performed not by continuously valued factors but by interactions among neurons that spike discretely and variably. Models provide a means of bridging these levels of description. We developed a general method for training model networks of spiking neurons by leveraging factors extracted from either data or firing-rate-based networks. In addition to providing a useful model-building framework, this formalism illustrates how reliable and continuously valued factors can arise from seemingly stochastic spiking. Our framework establishes procedures for embedding this property in network models with different levels of realism. The relationship between spikes and factors in such networks provides a foundation for interpreting (and subtly redefining) commonly used quantities such as firing rates.
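
For orientation on the topic, below is a minimal sketch of the classic FORCE rule (recursive least squares applied online to the linear readout of a chaotic rate network, in the style of Sussillo & Abbott). It is a generic illustration of FORCE learning, not the spiking framework described in the abstract above; the network size, gain, target signal, and variable names are illustrative assumptions.

```python
# Minimal FORCE-learning sketch: recursive least squares (RLS) trains the
# readout weights of a chaotic rate network while its output is fed back.
# All sizes and parameters below are illustrative, not taken from the talk.
import numpy as np

rng = np.random.default_rng(0)
N, dt, T = 500, 1e-3, 2.0          # neurons, time step (s), duration (s)
tau, g, alpha = 10e-3, 1.5, 1.0    # membrane time constant, chaos gain, RLS prior

J = g * rng.standard_normal((N, N)) / np.sqrt(N)   # recurrent weights
w = np.zeros(N)                                    # readout weights (trained)
u = rng.uniform(-1, 1, N)                          # feedback weights
P = np.eye(N) / alpha                              # RLS inverse correlation matrix

x = 0.5 * rng.standard_normal(N)                   # network state
r = np.tanh(x)                                     # firing rates
z = 0.0                                            # readout / feedback signal

steps = int(T / dt)
t_grid = np.arange(steps) * dt
f_target = np.sin(2 * np.pi * 5 * t_grid)          # illustrative target: 5 Hz sine

for i in range(steps):
    # Rate dynamics with the readout z fed back into the network.
    x += dt / tau * (-x + J @ r + u * z)
    r = np.tanh(x)
    z = w @ r

    # FORCE/RLS update: drive the instantaneous readout error toward zero.
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)        # gain vector
    P -= np.outer(k, Pr)           # update inverse correlation estimate
    w -= (z - f_target[i]) * k     # error-driven weight change

print("final error:", abs(z - f_target[-1]))
```

The key design point of FORCE is that the error is suppressed from the first update onward, so the fed-back output never strays far from the target while the weights converge.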

FORCE learning coverage

1 item (1 seminar)

