[โ€“] kionite231@lemmy.ca 8 points 4 days ago (2 children)

No, I have tested it on my own video on YouTube. I don't know the specific reason.

[โ€“] dsilverz@calckey.world 1 points 4 days ago

@kionite231@lemmy.ca @NONE_dc@lemmy.world

As far as I know, there are two different domains in play here: youtube.com and googlevideo.com. The former serves the main interface as well as the API endpoints; the latter serves the video stream itself.

Both are distributed geographically (CDN), so each domain resolves (via DNS) to a data center as close as possible to the user (e.g., when I access YouTube, the domains resolve to Google data centers in São Paulo).
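
You can see this geo-steering yourself by resolving the domains; here's a minimal Python sketch (the `redirector.googlevideo.com` hostname is just one publicly known entry point; the actual per-stream hostnames are handed out by youtube.com):

```python
import socket

# Resolve both domains. The IPs returned depend on where your DNS resolver
# sits, which is how the CDN steers you toward a nearby data center.
for host in ("www.youtube.com", "redirector.googlevideo.com"):
    infos = socket.getaddrinfo(host, 443, proto=socket.IPPROTO_TCP)
    ips = sorted({info[4][0] for info in infos})
    print(host, "->", ips)
```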

This regionalization makes real-time propagation of video statistics difficult, so the view count and other metrics are often delayed: they're aggregated regionally and only later reported back to the main data center.
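
A toy model of that kind of delayed aggregation, purely as an illustration (this is not YouTube's actual pipeline):

```python
from collections import Counter

class RegionalViewCounter:
    """Toy model: each region buffers views locally and merges them
    into a central tally on a schedule, so the global count lags."""

    def __init__(self):
        self.regional = Counter()  # pending per-region counts
        self.central = Counter()   # "authoritative" count, updated late

    def record_view(self, region: str, video_id: str):
        # Cheap local write; no cross-region round trip needed.
        self.regional[(region, video_id)] += 1

    def flush(self):
        # Periodic batch job: merge regional buffers into the central count.
        for (region, video_id), n in self.regional.items():
            self.central[video_id] += n
        self.regional.clear()

counter = RegionalViewCounter()
counter.record_view("sa-east", "dQw4w9WgXcQ")
counter.record_view("us-east", "dQw4w9WgXcQ")
print(counter.central["dQw4w9WgXcQ"])  # 0 -- flush hasn't run yet
counter.flush()
print(counter.central["dQw4w9WgXcQ"])  # 2 -- visible after aggregation
```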

That's also why, for example, video IDs aren't sequential (1, 2, 3, 4, ...): sequential IDs would require the servers to synchronize their state in real time, leading to the same delays (or worse) as if every user accessed the main data center directly. That data center can be over ten thousand kilometers away, say for a user in the Middle East, since the main servers are in the US. Light travels at roughly 300,000 km/s in a vacuum, and slower through glass, which is what optical fiber is made of. That seems fast, but it's slow in computing terms, because the information has to make many round trips to carry all the network packets.
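
Some back-of-the-envelope numbers for the light-speed argument, plus a sketch of why random IDs sidestep the coordination problem (the 11-character alphabet matches the visible format of YouTube IDs, but the generation scheme here is only illustrative):

```python
import secrets
import string

# Rough one-way latency floor for ~12,000 km (Middle East to a US data
# center), assuming light in fiber travels at about 2/3 c (~200,000 km/s).
distance_km = 12_000
speed_km_s = 200_000
one_way_ms = distance_km / speed_km_s * 1000
print(f"one-way: {one_way_ms:.0f} ms, round trip: {2 * one_way_ms:.0f} ms")

# Sequential IDs need global coordination; random ones don't. Any data
# center can mint an ID independently, with collisions made negligible
# by the size of the space (64**11, about 7.3e19 possible IDs).
alphabet = string.ascii_letters + string.digits + "-_"
video_id = "".join(secrets.choice(alphabet) for _ in range(11))
print(video_id)
```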

Then there's another phenomenon: streaming a video can involve multiple reconnections while the content plays. This is especially noticeable when there are thousands or millions of simultaneous viewers, and the user experiences it as buffering delays. If every connection counted as a new view, the same viewer would be counted multiple times, so views are counted through the main interface instead, via the main domain youtube.com. Even when people watch through the app or on a smart TV, the device first queries the YouTube domain, which returns the stream information, including the exact googlevideo.com URL for that video.
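
The dedup idea in miniature: count a playback session once, however many stream connections it opens. The token scheme below is hypothetical, just to show the principle:

```python
import uuid

counted_sessions: set[str] = set()
view_count = 0

def start_playback() -> str:
    """The watch page (youtube.com) hands out one token per playback."""
    return uuid.uuid4().hex

def on_stream_connection(session_token: str) -> None:
    """Called for every connection to the stream servers (googlevideo.com).
    Reconnections reuse the same token, so the view is counted only once."""
    global view_count
    if session_token not in counted_sessions:
        counted_sessions.add(session_token)
        view_count += 1

token = start_playback()
for _ in range(5):           # buffering causes five reconnections...
    on_stream_connection(token)
print(view_count)            # ...but it still counts as one view
```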

Invidious, as far as I know, retrieves the streaming information from the main interface (via web scraping, since the official API is restricted in this regard), so to YouTube it looks like a regular user accessing the video, and it should count as a view. View counts aren't updated instantly, which is probably why you didn't see the count go up.
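
If the scraping works as described, the flow would look roughly like this. `ytInitialPlayerResponse` is the JSON blob YouTube embeds in its watch pages, but the markup changes often, so treat this as a fragile illustration rather than how Invidious actually does it:

```python
import json
import re
import urllib.request

def fetch_stream_info(video_id: str) -> dict:
    """Fetch the watch page like a browser would and extract the embedded
    player JSON, which contains the googlevideo.com stream URLs."""
    url = f"https://www.youtube.com/watch?v={video_id}"
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    match = re.search(r"ytInitialPlayerResponse\s*=\s*(\{.+?\})\s*;", html)
    if not match:
        raise RuntimeError("player JSON not found (page layout changed?)")
    return json.loads(match.group(1))

info = fetch_stream_info("dQw4w9WgXcQ")
formats = info.get("streamingData", {}).get("formats", [])
if formats:
    # Some formats hide the URL behind a signature cipher instead.
    print(formats[0].get("url", "(cipher-protected)")[:60])
```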