Learning Graph Cellular Automata | Daniele Grattarola

Join the Learning on Graphs and Geometry Reading Group: https://hannes-stark.com/logag-readin...

Paper "Learning Graph Cellular Automata": https://arxiv.org/abs/2110.14237

Abstract: Cellular automata (CA) are a class of computational models that exhibit rich dynamics emerging from the local interaction of cells arranged in a regular lattice. In this work we focus on a generalised version of typical CA, called graph cellular automata (GCA), in which the lattice structure is replaced by an arbitrary graph. In particular, we extend previous work that used convolutional neural networks to learn the transition rule of conventional CA and we use graph neural networks to learn a variety of transition rules for GCA. First, we present a general-purpose architecture for learning GCA, and we show that it can represent any arbitrary GCA with finite and discrete state space. Then, we test our approach on three different tasks: 1) learning the transition rule of a GCA on a Voronoi tessellation; 2) imitating the behaviour of a group of flocking agents; 3) learning a rule that converges to a desired target state.

Authors: Daniele Grattarola, Lorenzo Livi, Cesare Alippi

Twitter Hannes: / hannesstaerk
Twitter Dominique: / dom_beaini
Twitter Valence Discovery: / valence_ai
Reading Group Slack: https://logag.slack.com/join/shared_i...

Timestamps:
00:00 Intro
01:13 Cellular Automata
07:17 Graph Cellular Automata
10:17 Learning GCA
24:52 GNCA on Voronoi Tessellation
35:15 GNCA for Agent-Based Modelling
39:26 GNCA that converge to a fixed target
01:01:16 Future Research
01:03:32 Q&A
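To make the idea in the abstract concrete, below is a minimal sketch of one graph-neural-cellular-automaton (GNCA) update step: every node's next state is computed by a small neural network from its own state and an aggregate of its neighbours' states. This is an illustration of generic message passing only, not the architecture from the paper; the class GNCARule, its weight shapes, and the random graph are all hypothetical choices for this example.

```python
# Minimal GNCA step sketch (assumed message-passing form, not the paper's exact model).
import numpy as np

rng = np.random.default_rng(0)


def relu(x):
    return np.maximum(x, 0.0)


class GNCARule:
    """Transition rule: the next state of each node is an MLP applied to the
    concatenation of its current state and the sum of its neighbours' states."""

    def __init__(self, state_dim, hidden_dim=32):
        # Random weights stand in for parameters that would be learned by gradient descent.
        self.W1 = rng.normal(scale=0.1, size=(2 * state_dim, hidden_dim))
        self.b1 = np.zeros(hidden_dim)
        self.W2 = rng.normal(scale=0.1, size=(hidden_dim, state_dim))
        self.b2 = np.zeros(state_dim)

    def step(self, states, adjacency):
        # states: (num_nodes, state_dim); adjacency: (num_nodes, num_nodes) 0/1 matrix.
        messages = adjacency @ states                      # aggregate neighbour states
        inputs = np.concatenate([states, messages], axis=1)
        hidden = relu(inputs @ self.W1 + self.b1)
        return hidden @ self.W2 + self.b2                  # next state for every node


# Usage: roll the automaton forward a few steps on a small random undirected graph.
num_nodes, state_dim = 10, 4
adjacency = (rng.random((num_nodes, num_nodes)) < 0.3).astype(float)
np.fill_diagonal(adjacency, 0.0)
adjacency = np.maximum(adjacency, adjacency.T)

rule = GNCARule(state_dim)
states = rng.normal(size=(num_nodes, state_dim))
for _ in range(5):
    states = rule.step(states, adjacency)
print(states.shape)  # (10, 4)
```

Because the same rule is applied at every node using only local (neighbourhood) information, the update is independent of the graph's size and structure, which is what lets the learned rule generalise beyond a regular lattice.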
