A rather overwhelming flow of interesting new things lately; here is a very incomplete selection: twitter.com/ComputingByArts

In particular, it turns out that François Chollet, the creator of Keras, wrote this text in the last chapter of his book Deep Learning with Python three years ago:

blog.keras.io/the-future-of-deep-learning.html

It resonates a lot with what I am doing.

***

"At a high-level, the main directions in which I see promise are:

  • Models closer to general-purpose computer programs, built on top of far richer primitives than our current differentiable layers—this is how we will get to reasoning and abstraction, the fundamental weakness of current models.
  • New forms of learning that make the above possible—allowing models to move away from just differentiable transforms.
  • Models that require less involvement from human engineers—it shouldn't be your job to tune knobs endlessly.
  • Greater, systematic reuse of previously learned features and architectures; meta-learning systems based on reusable and modular program subroutines.

Additionally, do note that these considerations are not specific to the sort of supervised learning that has been the bread and butter of deep learning so far—rather, they are applicable to any form of machine learning, including unsupervised, self-supervised, and reinforcement learning. It is not fundamentally important where your labels come from or what your training loop looks like; these different branches of machine learning are just different facets of the same construct."

***

I am going to include the concluding summary of his text in a comment.
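To make the first quoted bullet more concrete, here is a minimal sketch in Python of a network that is closer to a "general-purpose computer program": the connectivity matrix plays the role of the program, and the neurons apply arbitrary built-in transforms rather than only differentiable layers, roughly in the spirit of dataflow matrix machines. This is my illustration, not code from the post or the book; the names (step, TRANSFORMS, W) and the restriction to scalar streams are assumptions made to keep the example small.

import numpy as np

# Each neuron has a built-in transform; these can be arbitrary functions,
# not just differentiable layers (here: tanh, ReLU, identity).
TRANSFORMS = [np.tanh, lambda x: max(x, 0.0), lambda x: x]

def step(W, outputs):
    """One two-stroke cycle: W linearly recombines the neuron outputs into
    neuron inputs ("down movement"), then each neuron applies its own
    transform to its input ("up movement")."""
    inputs = W @ outputs
    return np.array([f(x) for f, x in zip(TRANSFORMS, inputs)])

# The matrix is the "program": editing or learning it reprograms the network.
W = np.array([[0.0, 0.5, 0.1],
              [0.3, 0.0, 0.2],
              [0.1, 0.1, 0.0]])

outputs = np.ones(3)
for _ in range(5):
    outputs = step(W, outputs)
print(outputs)

In full dataflow matrix machines, as I understand them, the streams carry arbitrary vector-like data and the matrix itself can be one of the streams, which is what lets a network modify its own program; the sketch above keeps only the two-stroke shape of the update cycle.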
