
How do tokenized protocols become 'fat' or 'thin'?
I want to understand how tokenized protocols are classified as 'fat' or 'thin'. What criteria or characteristics determine this classification, and how does a protocol transition from 'thin' to 'fat', or vice versa?
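
For context, I understand the terms trace back to the 'Fat Protocols' thesis (Joel Monegro, 2016), which argues that in blockchain stacks most value accrues to the shared protocol layer rather than to the applications built on top, the opposite of the web, where thin protocols like TCP/IP captured little value relative to the applications above them. Here is a minimal sketch of how I imagine 'fatness' might be quantified; the metric, function names, and the 0.5 threshold are my own assumptions for illustration, not an established definition:

```python
# Hypothetical metric: the share of total ecosystem market cap captured by
# the protocol's native token versus the tokens of applications built on it.
# The 0.5 threshold below is an arbitrary illustration, not a standard cutoff.

def fatness_ratio(protocol_mcap: float, app_mcaps: list[float]) -> float:
    """Fraction of total ecosystem value held at the protocol layer."""
    total = protocol_mcap + sum(app_mcaps)
    return protocol_mcap / total if total > 0 else 0.0

def classify(ratio: float, threshold: float = 0.5) -> str:
    """Label a protocol 'fat' if it captures more value than its apps combined."""
    return "fat" if ratio > threshold else "thin"

# Example with made-up numbers (USD market caps)
protocol = 50e9           # native token of the base protocol
apps = [5e9, 3e9, 1e9]    # tokens of applications built on the protocol

r = fatness_ratio(protocol, apps)
print(f"value captured at protocol layer: {r:.0%} -> {classify(r)}")
# -> value captured at protocol layer: 85% -> fat
```

Is a value-capture ratio like this roughly what the classification is based on, or do other characteristics (governance, fee flows, token utility) matter more?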
