Giganews problems

Earthlink tore up their own Usenet server farm a long time ago. For customers who were around at that time, they have provided Usenet service through Giganews ever since. This has worked well for the most part, but there are problems.

Email notices

It used to be that I would occasionally receive an email from Giganews telling me that I was approaching my data limit (35GB/month). No hint was ever provided as to what usage triggered that email. After some effort I found it to be about 25GB.

Then the emails stopped. They don't even appear in the SPAM folder. I suspect that they have fallen prey to yet another level of SPAM filtering. At some point the email headers started showing that all incoming email was first passing through servers at vadesecure.net. Giganews had been altering (most would call it forging) the headers so the notices looked like they were coming from Earthlink. That likely triggers this extra (silent) level of filtering.

Missing article bodies

I have used an old Perl script (aub) to assemble multi-part binary files for over twenty years. It works well enough for my needs. Then a year or three ago it began having trouble.

aub first downloads the headers for new messages and then scans the subject lines looking for files to assemble. If it finds all of the parts for a file, it requests the message bodies using the NNTP BODY command. This began to occasionally return a "430 no such article" error. Very odd, considering that the server had just provided the header.

This wasn't a big problem at first because most files had forward error correction (parity files) available. But aub would throw away an entire file when this error occurred, so as the frequency of the problem increased it reached the point where sometimes the parity files weren't enough. I dug into the Perl code and modified it so that it would just continue on with what it had, along the lines of the sketch below. This was sufficient for a while.
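
This is not aub's actual code, just a minimal sketch of the shape of that change using the Net::NNTP module; the server name, credentials, group, and article numbers are all stand-ins:

    #!/usr/bin/perl
    # Sketch only: fetch each part of a file with BODY and keep going on
    # a 430 error instead of discarding the whole file.
    use strict;
    use warnings;
    use Net::NNTP;

    my $nntp = Net::NNTP->new('news.west.earthlink.net')
        or die "connect failed\n";
    $nntp->authinfo('user', 'password') or die "auth failed\n";
    $nntp->group('alt.binaries.sounds.mp3.dr_demento')
        or die "group failed\n";

    my @parts = (1636320 .. 1636325);   # article numbers for one file's parts
    for my $artno (@parts) {
        my $body = $nntp->body($artno);  # undef on "430 no such article"
        if (defined $body) {
            # $body is a reference to an array of lines; decode/save here.
        } else {
            # The old behavior was to abandon the file here. Instead, log
            # the miss and let the parity (PAR2) files repair the gap.
            warn "article $artno: ", $nntp->code, " ", $nntp->message;
        }
    }
    $nntp->quit;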

I rarely use nzb files, but when I do it is with the nzbperl program. This is fairly smart and can recognize that it already has a part of a file and skip downloading it. So I tried running it a second time, and sometimes this picked up the missing files, showing that the problem was not only random but transient. Digging into the docs revealed that it had a keepbroken option that helped even more.
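
The invocation would look something like this (the exact flag spelling here is an assumption; check nzbperl's documentation for your version):

    nzbperl.pl --keepbroken files.nzb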

Bogus responses to the GROUP command

More recently I began seeing the occasional bogus response to the NNTP GROUP command. The client sends this to select a particular newsgroup to read. The reply includes an article count and the first and last article numbers. Sometimes the first and last article numbers were given as 2 and 1. Normally when a group has no articles available these are equal, so a first greater than last is very odd.

A later attempt would get the correct values. Then I noticed a variant of the bogus reply where the last article number dropped to a lower value. For example:

      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
      211 22561 1636319 1658879 alt.binaries.sounds.mp3.dr_demento
      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
      211 22561 1636319 1658879 alt.binaries.sounds.mp3.dr_demento
      211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
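
A check for this is easy to script. Here is a minimal sketch, again using Net::NNTP with placeholder server and credentials, that flags a reply whose first article number exceeds the last:

    #!/usr/bin/perl
    # Sketch only: issue a single GROUP command and flag a bogus reply.
    use strict;
    use warnings;
    use Net::NNTP;

    my $nntp = Net::NNTP->new('news.west.earthlink.net')
        or die "connect failed\n";
    $nntp->authinfo('user', 'password') or die "auth failed\n";

    my ($count, $first, $last, $name) =
        $nntp->group('alt.binaries.sounds.mp3.dr_demento')
        or die "group failed\n";
    $nntp->quit;

    # A first article number greater than the last is the bogus case.
    if ($first > $last) {
        print "bogus: 211 $count $first $last $name\n";
    } else {
        print "ok:    211 $count $first $last $name\n";
    }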

I finally decided to complain about the problems with the BODY and GROUP commands. Not to Earthlink customer support, because they wouldn't have a clue, but directly to Giganews. I didn't have much hope that this would result in any reply at all, since I don't really have an account with them. But I did get a reply.

After I banged my head against first-level customer support for a while, the issue was finally escalated to the next level. Not that it helped much. I explained that both problems were occasional, random, and transient. The problem was referred to their "engineers" (I can sneer at that since I have an MSEE), who reported back in less than a day that they couldn't see any trouble. And the problem was declared solved.

Data

Since they couldn't (or wouldn't) find the problem, I decided to start some data collection. (I had suggested that this was the sort of thing they ought to be doing.) I created a program that logs onto the server, issues a single GROUP command (for the Dr. Demento group), and records a time-stamped result. It runs about every 30 minutes on my Raspberry Pi (along with its other data collection tasks) and produces a graph of the last article number.
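
The real program isn't reproduced here, but its core amounts to something like this (placeholder server and credentials; cron supplies the every-30-minutes schedule):

    #!/usr/bin/perl
    # Sketch only: log a time-stamped GROUP result for later graphing.
    use strict;
    use warnings;
    use Net::NNTP;
    use POSIX qw(strftime);

    my $nntp = Net::NNTP->new('news.west.earthlink.net')
        or die "connect failed\n";
    $nntp->authinfo('user', 'password') or die "auth failed\n";
    my ($count, $first, $last) =
        $nntp->group('alt.binaries.sounds.mp3.dr_demento')
        or die "group failed\n";
    $nntp->quit;

    my $stamp = strftime('%Y-%m-%d %H:%M:%S', localtime);
    open my $log, '>>', 'dr_demento.log' or die "open: $!\n";
    print $log "$stamp 211 $count $first $last\n";
    close $log;

A crontab entry along the lines of "*/30 * * * * /home/pi/grouplog.pl" handles the scheduling; the graph is then just the time stamps plotted against the last article number.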

[Graph: last article number reported for alt.binaries.sounds.mp3.dr_demento over time]

I sent another email to Giganews customer support pointing them to this graph. Maybe it will help. If not, then this page stays up in the hope that it will have some effect.

December 2021

The missing-articles problem is worse. To illustrate it I had to use a group a bit more active than the Dr. Demento one (which was down to zero articles available). When I look at an active group (pornstars.80s in this case) I see trouble. If I subscribe to it using Thunderbird, I see an interesting phenomenon: it will report that it has found some number of new articles (thousands), but when I let it download the headers it finds fewer than that number.

So I wrote a program to probe the problem. The program accepts a group name on the command line. It selects that group, looks at the available articles, subtracts some value from the last article number, and issues an XHDR command.

I use XHDR, just like aub does, to get the subject lines. The program then goes through the list and fetches the full headers one at a time using the HEAD command. If that reports an error, the error is printed.
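
The listing below is a sketch of that probe under the same placeholder assumptions, not the program's exact source:

    #!/usr/bin/perl
    # Sketch only: find articles that XHDR lists but HEAD cannot fetch.
    # Usage: ./missing <groupname>
    use strict;
    use warnings;
    use Net::NNTP;

    my $group = shift or die "usage: $0 groupname\n";
    my $back  = 2000;    # how far back from the last article to look

    my $nntp = Net::NNTP->new('news.west.earthlink.net')
        or die "connect failed\n";
    $nntp->authinfo('user', 'password') or die "auth failed\n";

    print "group $group\n";
    my ($count, $first, $last) = $nntp->group($group)
        or die "group failed\n";
    print "211 $count $first $last $group\n";

    my $start = $last - $back;
    $start = $first if $start < $first;

    # XHDR returns a hash reference: article number => subject line.
    my $subjects = $nntp->xhdr('Subject', [ $start, $last ]);
    printf "found %d headers\n", scalar keys %$subjects;

    for my $artno (sort { $a <=> $b } keys %$subjects) {
        next if $nntp->head($artno);   # full header came back; all is well
        # HEAD failed: print the subject and the server's error reply.
        my $msg = $nntp->message;
        chomp $msg;
        print "$artno $subjects->{$artno}\n";
        print $nntp->code, " $msg\n";
    }
    $nntp->quit;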

I was expecting to find a mismatch between the data returned by XHDR and HEAD. But I also found another problem: completely missing articles.

Just after new articles appear, some don't seem to exist at all. As an example, shortly after over 6,000 articles were posted I ran this program. It found the last article number (from the response to the GROUP command), backed up 2,000, and requested the data. Since the range is inclusive, this should have returned 2,001 articles. Instead I received 1,792.

Just to make sure it wasn't a problem with my code (the read-line routine has been a problem before), I used telnet (see below) to check. In part:

34238530 Sue Nero - 27 clips "(Sue Nero 27).avi.001" yEnc (52/77)
34238531 Sue Nero - 27 clips "(Sue Nero 27).avi.001" yEnc (61/77)
34238532 Sue Nero - 27 clips "(Sue Nero 27).avi.002" yEnc (59/77)
34238535 Sue Nero - 27 clips "(Sue Nero 27).avi.vol54+69.par2" yEnc (14/17)
34238536 Sue Nero - 27 clips "(Sue Nero 27).avi.001" yEnc (54/77)
34238537 Sue Nero - 27 clips "(Sue Nero 27).avi.vol54+69.par2" yEnc (15/17)
34238539 Sue Nero - 27 clips "(Sue Nero 27).avi.003" yEnc (56/66)
34238541 Sue Nero - 27 clips "(Sue Nero 27).avi.001" yEnc (63/77)

The number at the beginning of each line is the article number, and there should be no gaps. Except that there are four missing here: 34238533, 34238534, 34238538, and 34238540. (I tried using the HEAD command on one of the missing article numbers and got a 423 error.)

It appears that the server is receiving articles, assigning an article number, storing it, and then being unable to find it. Until later. Maybe. I tried again 24 hours later with the same result: 1,792 articles retrieved instead of 2,001.

5 Jan. 2022

Performance is quite variable. A couple more batches of articles were posted; one seemed to do well, but today's group did not. Thunderbird again retrieved fewer articles than the number it first claimed were available. Running my program produced, in part:

    34264349 The Golden Age of Porn - Giant Juggs "The Golden Age of Porn - Giant Juggs.mkv.vol165+154.par2" yEnc (08/94)
    423 no such article in group
    34264350 The Golden Age of Porn - Giant Juggs "The Golden Age of Porn - Giant Juggs.mkv.vol077+088.par2" yEnc (15/55)
    423 no such article in group
    34264351 The Golden Age of Porn - Giant Juggs "The Golden Age of Porn - Giant Juggs.mkv.011" yEnc (102/130)
    423 no such article in group

A few hours later most of the missing articles had magically appeared:

$ ./missing alt.binaries.erotica.pornstars.80s
group alt.binaries.erotica.pornstars.80s
211 123337 34141185 34264521 alt.binaries.erotica.pornstars.80s
found 2001 headers
34263137 The Golden Age of Porn - Giant Juggs "The Golden Age of Porn - Giant Juggs.mkv.001" yEnc (115/130)
430 no such article

Except for the one that returned a 430 instead of a 423 error.

Telnet

If you happen to be using a newsreader that isn't giving you quite the amount of information you desire, you can access a server using telnet. The trick is to specify the correct (NNTP) port: "telnet news.west.earthlink.net 119" in my case. Then provide your login information and type in commands. You will want to have a guide to the NNTP commands handy. Avoid using the ARTICLE or BODY commands in a binary group, unless you want a screen full of yEnc noise.
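
A minimal session looks roughly like this; the status codes are standard NNTP, but the response text is paraphrased and the login is a placeholder:

    $ telnet news.west.earthlink.net 119
    200 (server greeting)
    AUTHINFO USER yourname
    381 password required
    AUTHINFO PASS yourpassword
    281 authentication accepted
    GROUP alt.binaries.sounds.mp3.dr_demento
    211 22714 1636319 1659032 alt.binaries.sounds.mp3.dr_demento
    HEAD 1659032
    221 1659032 <message-id> (header lines follow, ending with "." alone on a line)
    QUIT
    205 goodbye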
