youtube-comment-downloader-master.zip (9.76 KB)
I found some time to fork the comment-downloader and redo the changes properly this time, instead of just throwing it together in 30 minutes. Here are the changes:
>  -y now has actually tested, working URL parsing instead of whatever the old one was, so both short and long URLs are supported (see the sketch after this list)
>  the output file no longer needs to be specified for single downloads; it defaults to [id].json
>  if from-file downloading (-f) is used, -o acts as the output directory instead (so it no longer dumps everything into "./")
>  from-file downloading is much more robust now: the resulting filenames are actually converted to normal strings (no \n or $ ending up in the resulting filename.json), and since this happens automatically, -t was removed (sketched below)
>  the limit option should now work, if it is ever needed
>  added error messages for known errors, so when it breaks again because of Google we should know why
>  added an -a option that saves directly to a valid JSON array, so no fixing is needed afterwards (also sketched below)
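To give an idea of what the URL parsing amounts to (a rough sketch only; the regex and the function name are mine, not necessarily the exact code in the zip): both URL shapes reduce to pulling out the 11-character video ID, which also gives the default [id].json output name.

import re

# Rough illustration only: accept long URLs, short youtu.be URLs and
# bare IDs, and return the 11-character video ID (or None on no match).
_YT_ID = re.compile(
    r'(?:youtube\.com/watch\?(?:.*&)?v=|youtu\.be/|^)([0-9A-Za-z_-]{11})'
)

def extract_video_id(url):
    match = _YT_ID.search(url.strip())
    return match.group(1) if match else None

# with no output file given, a single download then defaults to:
# output = extract_video_id(arg) + '.json'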
A function that fetches the video name could be added, but I expect this to be used with the files generated by youtube-dl, and those already contain the name, so I don't think there is any need for it.
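For the curious, the filename cleanup and the -a output amount to something like this (the helper names and the exact character whitelist are just my sketch, check the zip for the real thing):

import json
import re

def safe_filename(title, max_len=200):
    # collapse newlines/tabs to spaces and drop shell-hostile characters
    # like $ ` " / \ so "Some\ntitle$" becomes "Some title"
    title = re.sub(r'\s+', ' ', title).strip()
    title = re.sub(r'[^\w \-.()\[\]]', '', title)
    return title[:max_len] or 'untitled'

def write_comments(comments, path, as_array=False):
    with open(path, 'w', encoding='utf-8') as fp:
        if as_array:
            # -a: a single valid JSON array, loadable with one json.load()
            json.dump(comments, fp, ensure_ascii=False, indent=2)
        else:
            # default: one JSON object per line, like the original tool
            for comment in comments:
                fp.write(json.dumps(comment, ensure_ascii=False) + '\n')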

 >>/7576/
> If that is not possible or easy would splitting into pages work?
It would have to be split anyway, so yes. I wanted them to autoload as you scroll down, but we could just link them as well. Now we just have to figure out what number of comments per page is optimal.
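The splitting itself is trivial once the comments are in a JSON array; something like this throwaway sketch (the 500 default and the page filenames are placeholders, nothing is decided):

import json

def paginate(json_path, per_page=500):
    # split one big comments array (e.g. the -a output) into numbered
    # page files: comments_page_001.json, comments_page_002.json, ...
    with open(json_path, encoding='utf-8') as fp:
        comments = json.load(fp)
    for page, start in enumerate(range(0, len(comments), per_page), 1):
        out = 'comments_page_%03d.json' % page
        with open(out, 'w', encoding='utf-8') as fp:
            json.dump(comments[start:start + per_page], fp, ensure_ascii=False)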
> They have the content but don't show what it was like or the discourse around it
Exactly what I had in mind.

 >>/7580/
I was afraid of that when they kept the maintenance page up for over a week. The only good thing is that the torrents are much healthier now.
Thankfully the tools were public, so I hope someone will pick it up again soon.