@freemor (re: http) Yeah, but still a lot of sites are preparing the jump to HTTP/2.0 with its unbelievable complexity. And even HTTP/1.1 is probably a bit over-engineered.
matched #a6rdmwa score:0.17
…mor (re: timeline) Or we re-sort the timeline to have the newest entries on top, then you could just request the first x bytes of every feed. But archiving is definitely the pragmatic solution. Get back …
matched #bgkedua score:0.16
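The idea above can be sketched in a few lines: if every feed keeps its newest entries at the top, a client only needs the head of the file instead of the whole history. This is a minimal illustration, assuming a twtxt-style feed of TAB-separated `timestamp<TAB>text` lines; the byte budget and helper names are made up for the example, and in a real client the byte-limited fetch would be an HTTP `Range: bytes=0-…` request rather than a local slice.

```python
def sort_newest_first(feed_text: str) -> str:
    """Re-sort a twtxt-style feed (lines of 'ISO-timestamp<TAB>text')
    so the newest entries come first. ISO 8601 timestamps sort
    correctly as plain strings, so no date parsing is needed."""
    lines = [l for l in feed_text.splitlines() if l and not l.startswith("#")]
    lines.sort(key=lambda l: l.split("\t", 1)[0], reverse=True)
    return "\n".join(lines) + "\n"

def head_entries(feed_text: str, max_bytes: int) -> list:
    """Simulate fetching only the first max_bytes of a feed and
    keeping the complete lines; the last line may be cut off
    mid-entry, so it is discarded."""
    chunk = feed_text.encode("utf-8")[:max_bytes].decode("utf-8", "ignore")
    complete = chunk.rsplit("\n", 1)[0] if "\n" in chunk else ""
    return [l for l in complete.split("\n") if l]
```

With a re-sorted feed, `head_entries(feed, 1024)` gives roughly the latest handful of twts for the price of a single small range request per feed.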
@freemor (re: bandwidth) We'll tackle that problem when it arises. I'm currently following 60 (mostly dormant) users with 500k for all files, and an update takes less than 2s.
matched #gpb4fga score:0.18