social.tchncs.de is one of the many independent Mastodon servers you can use to participate in the fediverse.
A friendly server from Germany that tends to attract techy people but welcomes everybody. This is one of the oldest Mastodon instances.

Server stats: 3.8K active users

#FeedForward

0 posts · 0 participants · 0 posts today
Philo Sophies<p><a href="https://planetearth.social/tags/Zoomposium" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Zoomposium</span></a> with Dr. <a href="https://planetearth.social/tags/Patrick" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Patrick</span></a> <a href="https://planetearth.social/tags/Krau%C3%9F" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Krauß</span></a>: Building instructions for <a href="https://planetearth.social/tags/artificial" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>artificial</span></a> <a href="https://planetearth.social/tags/consciousness" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>consciousness</span></a></p><p>Transferring the various stages of <a href="https://planetearth.social/tags/Damasio" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Damasio</span></a>'s <a href="https://planetearth.social/tags/theory" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>theory</span></a> of consciousness 1:1 into concrete <a href="https://planetearth.social/tags/schematics" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>schematics</span></a> for <a href="https://planetearth.social/tags/deeplearning" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>deeplearning</span></a>. 
To this end, strategies such as <a href="https://planetearth.social/tags/feedforward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>feedforward</span></a> connections, <a href="https://planetearth.social/tags/recurrent" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>recurrent</span></a> <a href="https://planetearth.social/tags/connections" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>connections</span></a> in the form of <a href="https://planetearth.social/tags/reinforcement" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>reinforcement</span></a> learning and <a href="https://planetearth.social/tags/unsupervised" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>unsupervised</span></a> <a href="https://planetearth.social/tags/learning" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>learning</span></a> are used to simulate the <a href="https://planetearth.social/tags/biological" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>biological</span></a> <a href="https://planetearth.social/tags/processes" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>processes</span></a> of the <a href="https://planetearth.social/tags/neuronal" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>neuronal</span></a> <a href="https://planetearth.social/tags/networks" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>networks</span></a>. 
</p><p>More at: <a href="https://philosophies.de/index.php/2023/10/24/bauanleitung-kuenstliches-bewusstsein/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">philosophies.de/index.php/2023</span><span class="invisible">/10/24/bauanleitung-kuenstliches-bewusstsein/</span></a></p><p>or: <a href="https://youtu.be/rXamzyoggCo" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="">youtu.be/rXamzyoggCo</span><span class="invisible"></span></a></p>
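The feedforward vs. recurrent distinction the post mentions can be sketched in a few lines of NumPy. The layer sizes, tanh activation, and random weights below are illustrative assumptions, not details from the episode:

```python
import numpy as np

rng = np.random.default_rng(0)

def feedforward_step(x, W, b):
    # Information flows strictly forward: the output depends only on the current input.
    return np.tanh(W @ x + b)

def recurrent_step(x, h, W_in, W_rec, b):
    # The hidden state h feeds back into itself, so the network carries memory.
    return np.tanh(W_in @ x + W_rec @ h + b)

n_in, n_hidden = 4, 8
W_in = rng.normal(size=(n_hidden, n_in))
W_rec = rng.normal(size=(n_hidden, n_hidden))
b = np.zeros(n_hidden)

x = rng.normal(size=n_in)
h = np.zeros(n_hidden)

y_ff = feedforward_step(x, W_in, b)         # stateless mapping
h = recurrent_step(x, h, W_in, W_rec, b)    # stateful: h accumulates history
```

Reinforcement and unsupervised learning would then be training procedures layered on top of these connection patterns, not changes to the wiring itself.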
RoundSparrow 🐦<p><span class="h-card" translate="no"><a href="https://mastodon.world/@paninid" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>paninid</span></a></span> <a href="https://mastodon.social/tags/ActuallyAutistic" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ActuallyAutistic</span></a> </p><p>Thank You! Appreciate it! <a href="https://mastodon.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> is cool!</p><p><a href="https://www.GutknechtAutism.org" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="">GutknechtAutism.org</span><span class="invisible"></span></a></p>
RoundSparrow 🐦<p><span class="h-card" translate="no"><a href="https://mastodon.social/@neotoy" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>neotoy</span></a></span> Agreed, <a href="https://mastodon.social/tags/FinnegansWakeMixingBoards" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FinnegansWakeMixingBoards</span></a> </p><p>All mixed up <a href="https://mastodon.social/tags/Humanity2024" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Humanity2024</span></a> </p><p>Thanks for <a href="https://mastodon.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> </p><p>Hate sucks, <a href="https://mastodon.social/tags/OutGroupHateOpera" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>OutGroupHateOpera</span></a> </p><p><a href="https://www.youtube.com/watch?v=4RFK8Ft-GNQ" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://www.</span><span class="ellipsis">youtube.com/watch?v=4RFK8Ft-GN</span><span class="invisible">Q</span></a></p>
Matt Willemsen<p>In a Striking Discovery, AI Shows Human-Like Memory Formation<br><a href="https://scitechdaily.com/in-a-striking-discovery-ai-shows-human-like-memory-formation/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">scitechdaily.com/in-a-striking</span><span class="invisible">-discovery-ai-shows-human-like-memory-formation/</span></a> <a href="https://mastodon.social/tags/AI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>AI</span></a> <a href="https://mastodon.social/tags/human" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>human</span></a> <a href="https://mastodon.social/tags/brain" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>brain</span></a> <a href="https://mastodon.social/tags/memory" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>memory</span></a> <a href="https://mastodon.social/tags/hippocampus" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>hippocampus</span></a> <a href="https://mastodon.social/tags/modeling" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>modeling</span></a> <a href="https://mastodon.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> <a href="https://mastodon.social/tags/Transformer" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Transformer</span></a></p>
Victoria Stuart 🇨🇦 🏳️‍⚧️<p>Addendum 10</p><p>1 Wide Feedforward All You Need<br><a href="https://arxiv.org/abs/2309.01826" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2309.01826</span><span class="invisible"></span></a></p><p>* 2 non-embedding components in transformer architecture: attention; feed forward network<br>* attention captures interdependencies betw. words regardless of position<br>* FFN non-linearly transforms ea. input token independently<br>* FFN (sig. fract. parameters) highly redundant<br>* modest drop in accuracy removing FFN on decoder layers &amp; sharing single FFN across encoder</p><p><a href="https://mastodon.social/tags/ML" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ML</span></a> <a href="https://mastodon.social/tags/transformers" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>transformers</span></a> <a href="https://mastodon.social/tags/NeuralNetworks" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>NeuralNetworks</span></a> <a href="https://mastodon.social/tags/attention" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>attention</span></a> <a href="https://mastodon.social/tags/FFN" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FFN</span></a> <a href="https://mastodon.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a></p>
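The two non-embedding components the abstract contrasts can be sketched in NumPy (single-head attention and illustrative dimensions; this is a sketch of the standard transformer blocks, not the paper's code). The final check illustrates the key property: the FFN transforms each token independently, so permuting the tokens commutes with it, which is why a single FFN can be shared across positions and layers:

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d_model, d_ff = 5, 16, 64

X = rng.normal(size=(seq_len, d_model))   # one token embedding per row

def softmax(z):
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Each output row is a weighted mix of ALL rows: positions interact.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(Q.shape[-1]))
    return A @ V

def ffn(X, W1, b1, W2, b2):
    # Applied to each token (row) independently: no interaction across positions.
    return np.maximum(X @ W1 + b1, 0) @ W2 + b2

Wq, Wk, Wv = (rng.normal(size=(d_model, d_model)) for _ in range(3))
W1, b1 = rng.normal(size=(d_model, d_ff)), np.zeros(d_ff)
W2, b2 = rng.normal(size=(d_ff, d_model)), np.zeros(d_model)

# Token independence: permuting rows before or after the FFN gives the same result.
perm = rng.permutation(seq_len)
assert np.allclose(ffn(X, W1, b1, W2, b2)[perm], ffn(X[perm], W1, b1, W2, b2))
```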
gtbarry<p>A jargon-free explanation of how AI large language models work</p><p>Word vectors - Humans represent words with letters. Language models use a long list of numbers</p><p>Each layer of an LLM is a transformer - Each layer takes a sequence of inputs—each word—and adds information</p><p>Feed-forward layers predict the next word</p><p><a href="https://mastodon.social/tags/LLM" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>LLM</span></a> <a href="https://mastodon.social/tags/neuralnetwork" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>neuralnetwork</span></a> <a href="https://mastodon.social/tags/artificialintelligence" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>artificialintelligence</span></a> <a href="https://mastodon.social/tags/ai" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>ai</span></a> <a href="https://mastodon.social/tags/generativeAI" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>generativeAI</span></a> <a href="https://mastodon.social/tags/WordVectors" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>WordVectors</span></a> <a href="https://mastodon.social/tags/transformer" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>transformer</span></a> <a href="https://mastodon.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> <a href="https://mastodon.social/tags/data" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>data</span></a> <a href="https://mastodon.social/tags/bigdata" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>bigdata</span></a> <a href="https://mastodon.social/tags/tech" class="mention hashtag" rel="nofollow noopener noreferrer" 
target="_blank">#<span>tech</span></a> <a href="https://mastodon.social/tags/innovation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>innovation</span></a></p><p><a href="https://arstechnica.com/science/2023/07/a-jargon-free-explanation-of-how-ai-large-language-models-work/" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="ellipsis">arstechnica.com/science/2023/0</span><span class="invisible">7/a-jargon-free-explanation-of-how-ai-large-language-models-work/</span></a></p>
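The ideas summarized above (a word vector is "a long list of numbers"; a feed-forward layer maps it to a guess about the next word) can be made concrete with a toy model. The tiny vocabulary and untrained random weights below are illustrative assumptions, so the output distribution is arbitrary until trained:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "cat", "sat", "on", "mat"]
d = 8                                    # embedding dimension
E = rng.normal(size=(len(vocab), d))     # each word = a long list of numbers

W1, b1 = rng.normal(size=(16, d)), np.zeros(16)
W2, b2 = rng.normal(size=(len(vocab), 16)), np.zeros(len(vocab))

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def next_word_probs(word):
    x = E[vocab.index(word)]         # look up the word vector
    h = np.maximum(W1 @ x + b1, 0)   # feed-forward hidden layer (ReLU)
    return softmax(W2 @ h + b2)      # probability distribution over the next word

p = next_word_probs("cat")
```

A real LLM stacks many transformer layers before this step and learns all the weights from data, but the final "feed-forward layers predict the next word" stage has exactly this shape.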
Victoria Stuart 🇨🇦 🏳️‍⚧️<p>Absorbing Phase Transitions in Artificial Deep Neural Networks<br><a href="https://arxiv.org/abs/2307.02284" rel="nofollow noopener noreferrer" translate="no" target="_blank"><span class="invisible">https://</span><span class="">arxiv.org/abs/2307.02284</span><span class="invisible"></span></a></p><p>To summarize, we believe that this work places the order-to-chaos transition in the initialized artificial deep neural networks in the broader context of absorbing phase transitions, &amp; serves as the first step toward the systematic comparison between natural/biological &amp; artificial neural networks.<br>...</p><p><a href="https://mastodon.social/tags/NeuralNetworks" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>NeuralNetworks</span></a> <a href="https://mastodon.social/tags/MeanFieldTheory" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>MeanFieldTheory</span></a> <a href="https://mastodon.social/tags/SignalPropagation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>SignalPropagation</span></a> <a href="https://mastodon.social/tags/PhaseTransitions" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>PhaseTransitions</span></a> <a href="https://mastodon.social/tags/backpropagation" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>backpropagation</span></a> <a href="https://mastodon.social/tags/feedforward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>feedforward</span></a></p>
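The order-to-chaos transition the abstract refers to can be demonstrated with a toy signal-propagation experiment (the depth, width, and weight variances below are illustrative assumptions, not the paper's setup): feed two nearly identical inputs through a deep random tanh network and watch their distance either die out (ordered phase, small weight variance) or blow up (chaotic phase, large variance).

```python
import numpy as np

def propagate_distance(sigma_w, depth=50, width=500, seed=0):
    """Distance between two nearby inputs after `depth` random tanh layers."""
    rng = np.random.default_rng(seed)
    x1 = rng.normal(size=width)
    x2 = x1 + 1e-3 * rng.normal(size=width)   # tiny perturbation of x1
    for _ in range(depth):
        # Fresh random weights per layer, variance sigma_w**2 / width.
        W = rng.normal(scale=sigma_w / np.sqrt(width), size=(width, width))
        x1, x2 = np.tanh(W @ x1), np.tanh(W @ x2)
    return float(np.linalg.norm(x1 - x2))

d_ordered = propagate_distance(0.5)   # small variance: perturbation contracts
d_chaotic = propagate_distance(2.5)   # large variance: perturbation expands
```

The perturbation is absorbed in one phase and amplified in the other, which is the qualitative picture behind framing initialization as an absorbing phase transition.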
Michael Hackel<p><a href="https://bildung.social/tags/FediLZ" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FediLZ</span></a> <a href="https://bildung.social/tags/Feedback" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Feedback</span></a> <a href="https://bildung.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> <a href="https://bildung.social/tags/matheedu" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>matheedu</span></a> </p><p>I have once again revised my self-assessment rubric. Thanks to <span class="h-card"><a href="https://bildung.social/@noelte030" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>noelte030</span></a></span> for the inspiration from his books.</p><p>What do you think of it? Suggestions for improvement?</p>
Michael Hackel<p><a href="https://bildung.social/tags/FediLZ" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FediLZ</span></a> <a href="https://bildung.social/tags/Feedback" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>Feedback</span></a> <a href="https://bildung.social/tags/FeedForward" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>FeedForward</span></a> </p><p>Inspired by a talk by <span class="h-card"><a href="https://bildung.social/@noelte030" class="u-url mention" rel="nofollow noopener noreferrer" target="_blank">@<span>noelte030</span></a></span>, I wanted to revise my self-assessment rubric with regard to the <a href="https://bildung.social/tags/4K" class="mention hashtag" rel="nofollow noopener noreferrer" target="_blank">#<span>4K</span></a>. Somehow I'm hitting my limits here. Do you have any ideas?</p>