YouTube on Wednesday will begin testing a new age-verification system in the U.S. that relies on artificial intelligence to differentiate between adults and minors.
Back up YouTube now. Be sure to embed chapters, metadata and subtitles.
Sure I’ll just put that on my 10 Exabyte drive. On my 100Gbps connection.
Obviously only the channels/specific videos of importance to you. No one is realistically proposing backing up all of YT.
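For archiving specific channels the way the comment above suggests, a minimal yt-dlp invocation might look like this (the channel URL is a placeholder; the flags are real yt-dlp options):

```shell
# Archive a channel's videos with chapters, metadata and subtitles embedded.
# --download-archive records finished IDs so re-runs only fetch new uploads.
yt-dlp \
  --embed-chapters \
  --embed-metadata \
  --embed-subs --sub-langs "en.*" \
  --download-archive archive.txt \
  "https://www.youtube.com/@SomeChannel/videos"
```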
I dream of the day we don’t rely on one host for user content like YouTube. I worry for the history of the content there.
I always secretly hoped one of the big porn companies would step up and start a competing service, since they’re probably the only ones with the necessary infrastructure already in place.
I backed up a few dozen channels on a spare 3TB drive. Planning on making torrents to add to a seedbox later.
What’s the best method for maintaining quality when backing up a YT vid?
yt-dlp usually picks the best formats by default, but it may need your browser cookies. I usually manually cap it at <=1080p.
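What the comment above describes might look like this in practice (the video URL and browser name are examples; the format selector and cookie flag are real yt-dlp options):

```shell
# Cap video at 1080p, merge with the best audio, and reuse browser cookies
# so yt-dlp acts like a signed-in user.
yt-dlp -f "bestvideo[height<=1080]+bestaudio/best" \
  --cookies-from-browser firefox \
  "https://www.youtube.com/watch?v=VIDEO_ID"
```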
Does it depend on connection speed, or does it fully buffer before capturing? I find there are sometimes sections that refuse to improve even if I go back and replay them. I also wonder whether SponsorBlock can cut sponsor segments out of the finished product. Guessing it's either/or?
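On the SponsorBlock question raised above: yt-dlp has built-in SponsorBlock integration, so sponsor segments can either be cut from the finished file or just marked as chapters (URL is a placeholder):

```shell
# Cut sponsor segments out of the downloaded file entirely:
yt-dlp --sponsorblock-remove sponsor "https://www.youtube.com/watch?v=VIDEO_ID"

# Or keep the video intact and mark segments as chapters instead:
yt-dlp --sponsorblock-mark all "https://www.youtube.com/watch?v=VIDEO_ID"
```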
I’m working on my music playlist. It’s 1,200 videos long and yt-dlp only gets part of the list. How can I download the full list?
You mean it’s skipping some of the entries in the actual playlist, or it’s just only getting the first x entries and then stopping?
I haven’t personally found a solution to random skipping other than allowing yt-dlp to use your account credentials/cookies to act as though it’s a signed-in user. YouTube is randomly deciding to block either some or all download attempts from non-signed-in clients, like yt-dlp.
If it’s just stopping after x number of videos, I have absolutely no clue.
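The signed-in-user workaround described above might look like this (playlist URL and browser name are placeholders; all flags are real yt-dlp options):

```shell
# Reuse browser cookies so requests look signed in, keep going past
# individual failures, and record completed IDs so re-runs resume cleanly.
yt-dlp --cookies-from-browser firefox \
  --download-archive done.txt \
  --ignore-errors \
  "https://www.youtube.com/playlist?list=PLAYLIST_ID"
```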
The first 98/1200 vids; something about API page 1.
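If extraction really is dying after the first API page, one thing worth trying is requesting an explicit item range and re-running until the archive file is complete (this is a workaround sketch, not a guaranteed fix; `--playlist-items` is a real yt-dlp option):

```shell
# Explicitly request entries 99 onward, skipping anything already archived.
yt-dlp --playlist-items "99-1200" \
  --download-archive done.txt \
  "https://www.youtube.com/playlist?list=PLAYLIST_ID"
```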
Aren’t there basically already projects for this? Seems like there’s a local download left over on a NewPipe server when someone plays a video through it?
I know very little about this, but it seems like it was a reason for “why running your own NewPipe server could be troublesome”.
Nothing unified I’m aware of. I figured the YouTube ID thing would push pirate sites to add sections though.
Are you confusing NewPipe for Piped?
Sounds like something I’d do.
Was thinking “was NewPipe the right project to reference in that comment”, and I don’t think it was.