Arnon Rotem-Gal-Oz’s explanation of this fallacy points out two possible interpretations, and overlooking either one can be a costly mistake.
First, you can consider the cost of moving data between your application and the network interface. The performance impact of serializing objects or parsing data adds a cost above and beyond the limitations of bandwidth and latency. See Apple’s 2010 WWDC session 117, “Building a Server-Driven User Experience,” for a comparison of the size and load time of the same data in XML, JSON, and plist formats (slides 40 and 41, at roughly 27:30). While the size of each format was similar, the time required to parse the result varied tremendously: 812 ms for XML, 416 ms for JSON, 140 ms for an ASCII plist, and only 19 ms for a binary plist.
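You can reproduce the spirit of that comparison with a few lines of code. The sketch below is a rough illustration, not Apple’s benchmark: it serializes a hypothetical payload (10,000 small records, an assumption of mine) as JSON, an XML plist, and a binary plist using Python’s standard library, then times how long each takes to parse. Absolute numbers will differ wildly by machine and payload; the point is that parse cost varies by format even when sizes are comparable.

```python
import json
import plistlib
import time

# Hypothetical payload: 10,000 small records. This is illustrative
# test data, not the dataset from the WWDC session.
records = [{"id": i, "name": f"item-{i}", "score": i * 0.5} for i in range(10_000)]

# Serialize the same data three ways.
json_bytes = json.dumps(records).encode("utf-8")
xml_plist = plistlib.dumps(records, fmt=plistlib.FMT_XML)
bin_plist = plistlib.dumps(records, fmt=plistlib.FMT_BINARY)

def time_parse(label, data, parse):
    """Parse `data` once and report its size and parse time."""
    start = time.perf_counter()
    parse(data)
    elapsed_ms = (time.perf_counter() - start) * 1000
    print(f"{label:>12}: {len(data):>8} bytes, parsed in {elapsed_ms:6.1f} ms")

time_parse("JSON", json_bytes, json.loads)
time_parse("XML plist", xml_plist, plistlib.loads)
time_parse("binary plist", bin_plist, plistlib.loads)
```

Running this on your own payloads is worth the five minutes: the right wire format for your app depends on what your parser, not just your network, can handle.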
Second, you can consider the cost of maintaining network services and infrastructure. How much load does each user place on your servers, and what do they cost to run? How many concurrent requests can your servers actually handle, and what does it cost to scale those systems? What does using your app cost users who live with monthly data limits or pay on a per-kilobyte basis?
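That last question is easy to estimate concretely. The back-of-the-envelope sketch below uses entirely made-up numbers (payload size, request frequency, and per-megabyte price are all assumptions, not figures from the article) to show how quickly a chatty app adds up against a metered data plan:

```python
# Back-of-the-envelope cost of an app on a metered data plan.
# Every number here is an illustrative assumption.
payload_kb = 200         # average response size per request, in KB
requests_per_day = 50    # how often the app talks to the server
price_per_mb = 0.01      # $ per MB on a hypothetical metered plan

monthly_mb = payload_kb * requests_per_day * 30 / 1024
monthly_cost = monthly_mb * price_per_mb
print(f"~{monthly_mb:.0f} MB/month, costing the user about ${monthly_cost:.2f}")
```

Trimming the payload or batching requests shows up directly in that figure, which is a useful way to make transport cost visible during design reviews.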