I didn’t train a new model. I didn’t merge weights. I didn’t run a single step of gradient descent. What I did was much weirder: I took an existing 72-billion-parameter model, duplicated a particular block of seven of its middle layers, and stitched the result back together. No weight was modified in the process. The model simply got extra copies of the layers it uses for thinking.
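The splicing step above can be sketched in a few lines. This is a minimal toy illustration, not the actual code used on the 72B model: each "layer" here is just a dict standing in for a decoder block's frozen weights, and `duplicate_block` is a hypothetical helper name. In a real Hugging Face model you would splice the model's `ModuleList` of decoder layers the same way.

```python
import copy

# Toy stand-in for a transformer's stack of decoder layers.
# Each "layer" is a dict holding its (frozen) weights.
layers = [{"idx": i, "weight": float(i)} for i in range(24)]

def duplicate_block(layers, start, count):
    """Splice an extra copy of layers[start:start+count] in right after
    the original block. No weight values are modified, only repeated."""
    block = [copy.deepcopy(l) for l in layers[start:start + count]]
    return layers[:start + count] + block + layers[start + count:]

# Duplicate a block of seven middle layers (indices 8..14).
expanded = duplicate_block(layers, start=8, count=7)

print(len(expanded))  # 31: the 24 originals plus 7 duplicates
# expanded[15] is the copy of the original layer at index 8,
# with identical weights but a distinct object:
print(expanded[15]["weight"] == expanded[8]["weight"])  # True
print(expanded[15] is expanded[8])                      # False
```

The deep copy matters: the duplicated layers carry the same values as their sources but are independent objects, so the run-through order is original block, then its copy, then the rest of the stack, with every original weight left untouched.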