Any newsgroup gurus in here? Please help me

hi

I just tried to download a movie from a newsgroup via Xnews; I downloaded a few RAR files to test the water :) and got files named like this:

.part32.rar
part32_0__error.rar
.part33.rar
.part33_0.rar, etc....

So why don't they have the regular .rar, .r01, etc. names like we normally see? What's the _0.rar for?

And what am I supposed to do with the _0_error.rar files?

Also, I noticed that the files don't have the sizes the RAR parts should have.

Before trying Xnews I tried Newsbin Pro, and it freaked me out: it kept downloading even though I tried to stop it, lol. All this newsgroup stuff is confusing as hell :) Please shed some light :)

I do have Agent, and that looks intimidating too; too many features for this non-techy slow learner :)
 
I think the reason is that the files you got are incomplete. Say part 33 was posted in 128 segments and only 126 of them are on your server: you'll get errors, because the newsreader needs all 128 segments to rebuild that RAR file.
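The idea above can be sketched in a few lines of Python (the segment numbers here are made up for illustration; real newsreaders do this bookkeeping internally):

```python
# A RAR part posted as N segments is only rebuildable once all N
# segments have arrived; anything less produces decode errors.

def missing_segments(received, total):
    """Return the segment numbers (1..total) that never arrived."""
    return sorted(set(range(1, total + 1)) - set(received))

# e.g. part 33 was posted in 128 segments but two never showed up:
got = [n for n in range(1, 129) if n not in (47, 112)]
print(missing_segments(got, 128))  # -> [47, 112]
```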
 
The only problem is that sometimes they don't post PAR files with their films, so you have to ask for fills or wait for a repost.
 
From what I've read, stuff on newsgroups is only current, right? Is there an archive where you can go back and search, or do you basically just take whatever is available, with no requesting allowed?

Oh, and if the poster doesn't post the PAR files, what other ways are there to get around it?
 
You can make requests if your news server allows you to post; also try looking in the reposts for older files.
 
ubamous3 said:
So what do I do? Delete them? I mean, how do you know if they're complete or not before you decide to download them?
You can usually tell before flagging them for download. In Xnews (since that's what I use), 100% complete posts show four light blue blocks, while incomplete posts usually show just three dark blue blocks. That's one reason some news servers are better than others in both retention and completion: retention is how long an article stays on the server, while completion is how much of it actually arrived.
 
hey almighty,

I use Xnews too (well, I'm still learning to get the hang of it); unfortunately I only learned about the completeness symbols after I started downloading.

The manual that xnewsguy has isn't all that comprehensive, and there are a couple of other tutorials that are easier to follow, but they don't address this particular problem.

These files are what I got from downloading; can you help me make sense of them? Also, does Xnews actually know not to double-download stuff, or will it just download whatever you queue?

XxxDisk 2.part18_1.rar ---A- 16,140,468
XxxDisk 2.part18_1_0.rar ---A- 15,520,624
XxxDisk 2.part23.rar ---A- 24,524
XxxDisk 2.part23_0.rar ---A- 15,520,624
XxxDisk 2.part23_0_0.rar ---A- 15,806,022
XxxDisk 2.part23_1__error_part_48.rar ---A- 12,392,473
XxxDisk 2.part28.rar ---A- 17,100,000


XxxDisk 2.part31.rar ---A- 17,325,000
XxxDisk 2.part32.rar ---A- 17,550,000
XxxDisk 2.part36.rar ---A- 18,000,000
XxxDisk 2.part36_0.rar ---A- 18,000,000
XxxDisk 2.part36_1.rar ---A- 18,000,000
 
ubamous,

I am not sure which Usenet newsgroup you are attempting to download the file from, but multi-part postings on Usenet newsgroups are usually sequence numbered (01/05), (02/05), and so forth, so that you can determine whether all the parts of the file are there.
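To make that concrete, here is a hypothetical Python sketch that groups posts by the (nn/NN) counter in their subject lines and reports which parts are missing (the subject format is an assumption; posters vary):

```python
# Group multipart posts by the trailing "(num/total)" counter in the
# subject line, then report which part numbers never arrived.
import re
from collections import defaultdict

PART_RE = re.compile(r"^(?P<name>.*)\((?P<num>\d+)/(?P<total>\d+)\)\s*$")

def find_missing(subjects):
    groups = defaultdict(lambda: (set(), 0))
    for subj in subjects:
        m = PART_RE.match(subj)
        if not m:
            continue  # not a numbered multipart subject
        name = m.group("name").strip()
        have, _ = groups[name]
        have.add(int(m.group("num")))
        groups[name] = (have, int(m.group("total")))
    return {name: sorted(set(range(1, total + 1)) - have)
            for name, (have, total) in groups.items()}

subs = ["movie.part01.rar (1/3)", "movie.part01.rar (3/3)"]
print(find_missing(subs))  # -> {'movie.part01.rar': [2]}
```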

In my experience with binaries newsgroups, postings are usually multipart, and you need to use the Combine and Decode function in Outlook Express.

You need to select ALL of the messages that are part of the original posting (each part will usually have a number). Hold the Control key (Ctrl) to mark/highlight all the parts of the message, then right-click the highlighted messages and select “Combine and Decode” from the pop-up menu.
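Conceptually, Combine and Decode just puts the parts in sequence order, joins their encoded bodies, and decodes the result. A toy Python sketch of that idea (real newsreaders handle uuencode/yEnc; base64 stands in here only to keep the example self-contained):

```python
# Toy model of "Combine and Decode": order the message parts by their
# sequence number, join the encoded bodies, and decode the whole.
import base64

def combine_and_decode(parts):
    """parts: list of (sequence_number, body_text), in any order."""
    joined = "".join(body for _, body in sorted(parts))
    return base64.b64decode(joined)

data = base64.b64encode(b"hello usenet").decode()
segs = [(2, data[8:]), (1, data[:8])]  # arrived out of order
print(combine_and_decode(segs))  # -> b'hello usenet'
```

Note that this only works if every part is present; a missing part corrupts the joined stream, which is exactly why incomplete posts fail to decode.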

If you need additional information, open Outlook Express, select “Help,” then “Contents and Index” (F1), and look under the topic “Combining Multipart Newsgroup Messages.”

You can also visit Google's Usenet archive (formerly Deja) and search the particular newsgroup for the missing files.

h**p://www.google.com/ — look under the heading “Groups” (Usenet newsgroups) and search the archived posts for the particular newsgroup.

Hope this information is helpful.

Regards,
Coaster
 
ubamous3 said:
hey almighty,

I use Xnews too (well, I'm still learning to get the hang of it); unfortunately I only learned about the completeness symbols after I started downloading.

The manual that xnewsguy has isn't all that comprehensive, and there are a couple of other tutorials that are easier to follow, but they don't address this particular problem.

These files are what I got from downloading; can you help me make sense of them? Also, does Xnews actually know not to double-download stuff, or will it just download whatever you queue?

XxxDisk 2.part18_1.rar ---A- 16,140,468
XxxDisk 2.part18_1_0.rar ---A- 15,520,624
XxxDisk 2.part23.rar ---A- 24,524
XxxDisk 2.part23_0.rar ---A- 15,520,624
XxxDisk 2.part23_0_0.rar ---A- 15,806,022
XxxDisk 2.part23_1__error_part_48.rar ---A- 12,392,473
XxxDisk 2.part28.rar ---A- 17,100,000


XxxDisk 2.part31.rar ---A- 17,325,000
XxxDisk 2.part32.rar ---A- 17,550,000
XxxDisk 2.part36.rar ---A- 18,000,000
XxxDisk 2.part36_0.rar ---A- 18,000,000
XxxDisk 2.part36_1.rar ---A- 18,000,000
From reading what you wrote: the correct way to download files is to flag each filename by hitting the space bar and then hitting F4 to download. All of the files you listed are incomplete downloads that Xnews doesn't know how to resume. The _error files are placeholders, and the ones with the extra numbers appear when it tries to resume but can't, so it writes a new file instead. The problem is that, with the extra number in the name, Xnews is using the wrong file for that part, so it can't generate the actual file needed. For part36, for example, you should end up with only one XxxDisk 2.part36.rar file when the download succeeds.
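Here is a rough Python sketch of my reading of that naming pattern (the exact suffix rules are Xnews internals I'm guessing at, so treat the classification as an assumption): a clean download is just "...partNN.rar", extra _0/_1 copies are broken resume attempts, and anything containing "_error" is a placeholder.

```python
# Classify downloaded RAR part names by their suffixes:
#   ...partNN.rar            -> clean download
#   ...partNN_0.rar, _0_0,.. -> broken resume copies (extra numbers)
#   ...partNN..._error...rar -> error placeholder
import re

SUFFIX_RE = re.compile(r"\.part(\d+)((?:_\d+)*)(__?error\S*)?\.rar$")

def classify(filename):
    m = SUFFIX_RE.search(filename)
    if not m:
        return None  # not a RAR part at all
    part = int(m.group(1))
    if m.group(3):
        return (part, "error placeholder")
    if m.group(2):
        return (part, "broken resume copy")
    return (part, "clean")

for f in ["XxxDisk 2.part28.rar",
          "XxxDisk 2.part23_0_0.rar",
          "XxxDisk 2.part32_0__error.rar"]:
    print(f, "->", classify(f))
```

Running this over your listing would flag only part28, part31, and part32 as clean; everything else would need a repost or PAR repair.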
 
Also, you might want to use Xnews or Google Groups (www.deja.com) to post to the newsgroup news.software.readers, where Luu Tran, the author of Xnews, and other Xnews users post.
 