  • Explaining what happens in a neural net is trivial. All it does is approximate a (generally) nonlinear function with a long series of multiplications and some rectification operations.

    That isn’t the hard part; you can track all of the math at each step, as the sketch after this comment illustrates.

    The hard part is stating a simple explanation for the semantic meaning of each operation.

    When a human solves a problem, we like to think that it occurs in discrete steps with simple goals: “First I will draw a diagram and put in the known information, then I will write the governing equations, then simplify them for the physics of the problem”, and so on.

    Neural nets don’t appear to solve problems that way; each atomic operation does not have that kind of semantic meaning. That is the root of all the reporting about how they are such ‘black boxes’ and researchers ‘don’t understand’ how they work.
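
    To make the “multiplications and rectification” point concrete, here is a minimal sketch of a forward pass through a tiny two-layer net in NumPy. The layer sizes and random weights are made up purely for illustration; the point is that every intermediate value is a plain number you can print, which is why tracking the math is easy even when its meaning is not.

    ```python
    import numpy as np

    def relu(x):
        # Rectification: zero out negative values.
        return np.maximum(0.0, x)

    def forward(x, weights, biases):
        # A forward pass is just repeated matrix multiplication
        # plus rectification; every intermediate is inspectable.
        for W, b in zip(weights[:-1], biases[:-1]):
            x = relu(x @ W + b)
        # Final layer is left linear.
        return x @ weights[-1] + biases[-1]

    rng = np.random.default_rng(0)
    # Hypothetical net: 3 inputs -> 4 hidden units -> 1 output.
    weights = [rng.normal(size=(3, 4)), rng.normal(size=(4, 1))]
    biases = [rng.normal(size=4), rng.normal(size=1)]
    print(forward(rng.normal(size=3), weights, biases))
    ```

    You can log every product and sum along the way; what you cannot do is point at any one of them and say what concept it represents.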




  • In the language of classical probability theory: the models learn the probability distribution of words in language from their training data, and then approximate this distribution using their parameters and network structure.

    When given a prompt, they then calculate the conditional probabilities of the next word, given the words they have already seen, and sample from that distribution (sketched after this comment).

    It is a rather simple idea; all of the complexity comes from trying to give the high-dimensional vector operations (which the models use to calculate those conditional probabilities) a human meaning.
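
    As a minimal sketch of that sampling step in NumPy, with a made-up four-word vocabulary and stand-in logits where a real model’s output would go:

    ```python
    import numpy as np

    def softmax(logits):
        # Convert raw scores into a probability distribution.
        z = logits - logits.max()  # subtract max for numerical stability
        e = np.exp(z)
        return e / e.sum()

    def sample_next_word(vocab, logits, rng):
        # Sample one word from P(next word | words seen so far);
        # the logits are assumed to come from the network.
        probs = softmax(logits)
        return rng.choice(vocab, p=probs)

    rng = np.random.default_rng(0)
    # Hypothetical tiny vocabulary and stand-in logits for some prompt.
    vocab = ["cat", "sat", "mat", "hat"]
    logits = np.array([2.0, 0.5, 1.0, -1.0])
    print(sample_next_word(vocab, logits, rng))
    ```

    All of the actual difficulty lives in producing those logits, which is where the high-dimensional vector operations come in.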


  • No, it isn’t. The key point is that they are removing water from the river and evaporating it.

    The water isn’t ‘lost’; it is still part of the hydrosphere, but it is made non-local. That water goes into the air and will go on to be rain in some place far away from the community where it was sourced. This will absolutely contribute to local droughts and water insecurity.