Comment by ad8e
8 months ago
A quote from discord: "apparently alpha-zero has been replicated in open source as leela-zero, and then leela-zero got a bunch of improvements so it's far ahead of alpha-zero. but leela-zero was barely mentioned at all in the paper; it was only dismissed in the introduction and not compared in the benchmarks. in the stockfish discord they are saying that leela zero can already do everything in this paper including using the transformer architecture."
Leela Zero was an amazing project improving on AlphaZero, demonstrating the feasibility of large-scale training with contributed compute cycles, and snatching the TCEC crown in Season 16.
It forced Stockfish to up its game, essentially by adopting neural techniques itself (though of a different type: Stockfish uses NNUE).
Yeah, they need to compare against the latest BT2 policy head. It would probably show about the same performance.
BT2 is old news; we have BT4 now.