![Denny Zhou on Twitter: "A key point in the chain-of-thought prompting paper: RIP downstream-task finetuning on LLMs." / Twitter](https://pbs.twimg.com/media/FnW_NM0XEAAfcbO.jpg)
Language: Train of thought vs. chain of thought. Which is older and which more popular? Do their usages differ in terms of formality etc.? - Quora
![Jason Wei on Twitter: "Turns out the ancient Chinese knew a lot about modern neural networks https://t.co/6ceD7wtUGj" / Twitter](https://pbs.twimg.com/media/FvYdwcgaAAkrD0V.png)