Anyone interested in participating? From the project's README:
"One reason for publishing this program is that we are setting up a public, distributed effort to repeat the work. Working together, and especially when starting on a smaller scale, it will take less than 1700 years to get a good network (which you can feed into this program, suddenly making it strong). Further details about this will be announced soon."
Leela-zero project
-
Isla
- Beginner
- Posts: 8
- Joined: Fri Aug 18, 2017 12:33 am
- GD Posts: 0
- Has thanked: 1 time
- Been thanked: 4 times
Leela-zero project
https://github.com/gcp/leela-zero
Last edited by Isla on Wed Oct 25, 2017 1:22 am, edited 1 time in total.
-
yoyoma
- Lives in gote
- Posts: 653
- Joined: Mon Apr 19, 2010 8:45 pm
- GD Posts: 0
- Location: Austin, Texas, USA
- Has thanked: 54 times
- Been thanked: 213 times
Re: Leela-zero project
Nice, this has potential! Thanks to Leela's author, Gian-Carlo Pascutto, for setting this up. I hope it can take off.
I'll look into trying it out this weekend. If anyone else tries it out let us know how it goes!
-
speedchase
- Lives in sente
- Posts: 800
- Joined: Sun Dec 04, 2011 4:36 pm
- Rank: AGA 2kyu
- GD Posts: 0
- Universal go server handle: speedchase
- Has thanked: 139 times
- Been thanked: 122 times
Re: Leela-zero project
I would like to participate; however, this concerns me somewhat: the license in ThreadPool.h is not compatible with the GPLv3.
Edit: Also, more importantly, I have no idea what this means (from the README):
"The code is released under the GPLv3 or later, except for ThreadPool.h, which has a specific license mentioned in that file."
"OpenCL C++ headers, https://github.com/KhronosGroup/OpenCL-CLHPP (You can just copy input_cl.hpp into CL/cl2.hpp)"
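The copy step quoted above can be sketched as follows. This is a minimal, hedged sketch: the exact layout of the OpenCL-CLHPP checkout is an assumption, so a placeholder file stands in for the real header here, letting the sketch run offline.

```shell
# Sketch of the "copy input_cl.hpp into CL/cl2.hpp" step quoted above.
# In a real setup the header would come from a KhronosGroup/OpenCL-CLHPP
# checkout; a placeholder file stands in for it here.
mkdir -p OpenCL-CLHPP CL
echo '// placeholder for the Khronos OpenCL C++ header' > OpenCL-CLHPP/input_cl.hpp
cp OpenCL-CLHPP/input_cl.hpp CL/cl2.hpp   # the build expects it as CL/cl2.hpp
ls CL
```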
Last edited by speedchase on Tue Oct 24, 2017 9:51 pm, edited 1 time in total.
-
jeromie
- Lives in sente
- Posts: 902
- Joined: Fri Jan 31, 2014 7:12 pm
- Rank: AGA 3k
- GD Posts: 0
- Universal go server handle: jeromie
- Location: Fort Collins, CO
- Has thanked: 319 times
- Been thanked: 287 times
Re: Leela-zero project
The license in Threadpool.h seems sufficiently permissive to not cause me any worries about contributing to the project, and since most of the code is licensed under the GPL I don't think there's any concern about contributions to the project as a whole being stolen. Is there anything about the license that gives you pause?
-
speedchase
- Lives in sente
- Posts: 800
- Joined: Sun Dec 04, 2011 4:36 pm
- Rank: AGA 2kyu
- GD Posts: 0
- Universal go server handle: speedchase
- Has thanked: 139 times
- Been thanked: 122 times
Re: Leela-zero project
My concern is that generally anything distributed alongside GPLv3 code must also be GPLv3. I am not a lawyer, so I have no idea how important this is in this context. I may just be overreacting.
Re: Leela-zero project
speedchase wrote: The license in ThreadPool.h is not compatible with the GPLv3... My concern is that generally anything distributed alongside GPLv3 code must also be GPLv3.
That's simply completely wrong: https://www.gnu.org/licenses/quick-guide-gplv3.en.html
There are a lot of open source licenses compatible with GPLv3, including the license in that file, which seems to be the zlib license. The Khronos headers are MIT/X11 licensed - also compatible. (If it weren't possible to join them then no GPL program would be able to use OpenGL and thus no 3D on Linux!)
-
Krama
- Lives in gote
- Posts: 436
- Joined: Mon Jan 06, 2014 3:46 am
- Rank: KGS 5 kyu
- GD Posts: 0
- Has thanked: 1 time
- Been thanked: 38 times
Re: Leela-zero project
It would probably be too silly of me to expect deepmind to release the weights of their NN.
What would they lose by doing that?
If nothing then why won't they do it?
-
moha
- Lives in gote
- Posts: 311
- Joined: Wed May 31, 2017 6:49 am
- Rank: 2d
- GD Posts: 0
- Been thanked: 45 times
Re: Leela-zero project
Krama wrote: It would probably be too silly of me to expect deepmind to release the weights of their NN. What would they lose by doing that? If nothing then why won't they do it?
I think this is not so simple. It's not just the weights that are in question, but several implementation details. DM probably experimented with quite a few NN structures and parameter tweaks before their network reached the levels published. It seems perfectly possible that even with decent expertise, the first few implementations will be inferior to theirs. And while they may not lose much by publishing the weights, the implementation details are a different matter. That knowledge is basically the reason Google invested into RL in the first place.
(BTW, for this reason I think the public/distributed "Zero" project will also need some kind of experimental/development channel, constantly trying new implementations, not just training a single net.)