.\" $Revision: 4.2.0 $
.TH SUCK 1
.SH NAME
suck - Pull a small newsfeed from an NNTP server, avoiding the NEWNEWS command.
.SH SYNOPSIS
.I suck
[
.BI hostname
]
[
.BI @filename
]
[
.BI \-V
]
[
.BI \-K
]
[
.BI \-L[SL]
]
[
.BI \-LF
filename
]
[
.BI \-H
]
[
.BI \-HC
]
[
.BI \-HF
filename
]
[
.BI \-HI
]
[
.BI \-HN
]
[
.BI \-HO
]
[
.BI \-d[tmd]
dirname
]
[
.BI \-s\ |\ \-S
filename
]
[
.BI \-e\ |\ \-E
filename
]
[
.BI \-a
]
[
.BI \-m
]
[
.BI \-b[irlf]
batchfile
]
[
.BI \-r
filesize
]
[
.BI \-p
extension
]
[
.BI \-U
userid
]
[
.BI \-P
password
]
[
.BI \-Q
]
[
.BI \-c
]
[
.BI \-M
]
[
.BI \-N
port_number
]
[
.BI \-W
pause_time pause_nr_msgs
]
[
.BI \-w
pause_time pause_nr_msgs
]
[
.BI \-l
phrase_file
]
[
.BI \-D
]
[
.BI \-R
]
[
.BI \-q
]
[
.BI \-C
count
]
[
.BI \-k
]
[
.BI \-A
]
[
.BI \-AL
activefile
]
[
.BI \-hl
localhost
]
[
.BI \-bp
]
[
.BI \-T
timeout
]
[
.BI \-n
]
[
.BI \-u
]
[
.BI \-z
]
[
.BI \-x
]
[
.BI \-B
]
[
.BI \-O
]
[
.BI \-G
]
[
.BI \-X
]
[
.BI \-f
]
[
.BI \-y
post_filter
]
[
.BI \-F
]
[
.BI \-g
]
[
.BI \-i
number_to_read
]
[
.BI \-Z
]
[
.BI \-rc
]
[
.BI \-lr
]
[
.BI \-sg
]
[
.BI \-ssl
]
[
.BI \-SSL
]
.SH OPTIONS
Options valid in all modes:
.PP
hostname
The hostname may optionally include the port number, in the form
.BI Host:Port .
If a port is given this way, any port number specified
via the \-N option is ignored.
@filename
This option tells suck to read other options from a file in addition to the
command line.
\-a
This option forces suck to always batch up any downloaded articles,
even if suck aborts for any reason. Without this option, suck will
only batch up articles if it finishes successfully or is cancelled by
a signal (see below).
\-A
This option tells suck to scan the localhost (specified with the \-hl option) and use its active file
to build and update the sucknewsrc. If you add a group to your local server, suck will add it to
sucknewsrc and download articles. Or, if you delete a group from your local server, it will be deleted
from sucknewsrc. If posting is not allowed to a particular group, then the line in sucknewsrc is
just commented out. With this option, you should never have to edit your sucknewsrc. In case you have
newsgroups (like control and junk) that you don't want downloaded, you can put these newsgroups in a
file "active-ignore", one per line, and suck will ignore these newsgroups when it scans the localhost.
If your system supports regex(), you may use regular expressions in the active-ignore file to skip multiple groups, eg: fred\.*.
If you use the -p (postfix) option, suck will check for the existence of an active-ignore file with the
postfix. If that doesn't exist, then suck will check for the existence of the file without the postfix.
NOTE: If the localhost is on a non-standard port, the port number may be specified as part of the hostname,
in the form
.BI Host:Port.
NOTE: If you use regular expressions, suck will silently add a "^" to the beginning of the group name
and a "$" to the end if they aren't already present, so that "comp.os.linux"
won't match "comp.os.linux.answers", and "alt.test" won't match "comp.alt.test".
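.PP
As an illustration only (the binaries entry is just an example pattern), an
active-ignore file might look like this, one group or regular expression per line:
.RS
.nf
junk
control
alt.binaries.*
.fi
.RE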
\-AL activefile
This option is identical to the -A option, except it reads the active file from the local file specified instead of
reading it from the localhost. All the caveats from the -A option apply to this option as well. If both
options are used on the command line, suck first tries to use the -A option, then if that fails it uses
this option.
\-B
This option tells suck to attempt to batch up any articles in its directory
BEFORE starting to download messages. This can be useful if you have a
problem with the previous download. This option will only work if you specify
a batch option (see below). If there are no messages to batch up, some
of the batch options may produce warning messages. They may be safely ignored.
Also, in inn-batch mode, if the batch file exists at the end of the run, it
will be overwritten, since the new batch file will contain all messages. In
rnews mode, if the batch file already exists, suck will abort and not batch up any messages.
\-c
If this option is specified, suck will clean up after itself. This includes:
.RS
1. Moving sucknewsrc to sucknewsrc.old
.RE
.RS
2. Moving suck.newrc to sucknewsrc
.RE
.RS
3. Removing suck.sorted and suckothermsgs.
.RE
\-C count
This option tells suck to drop the connection and reopen it every count number of articles.
This is designed to battle INN's LIKE_PULLERS=DONT option, which some folks compile in. With
LIKE_PULLERS=DONT, after 100 messages INN will pause between every message, dramatically
reducing your download speed. I don't recommend the use of this, but if you have no other choice....
\-dd dirname
\-dm dirname
\-dt dirname
Specify the location of the various files used by suck.
\-dd dirname = directory of data files used by suck (sucknewsrc suckkillfile suckothermsgs active-ignore sucknodownload)
\-dm dirname = directory for storage of articles created in Multifile mode
or batch mode. DO NOT make this the same as the directories used for the
\-dt or \-dd options, or you will lose all your configuration files.
\-dt dirname = directory of temp files created by suck (suck.newrc, suck.sort, suck.restart, suck.killlog, suck.post).
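.PP
For example (the hostname and directory names below are only illustrative):
.RS
%suck news.example.com \-dd /var/spool/suck \-dt /var/spool/suck/temp \-dm /var/spool/suck/msgs
.RE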
\-D
This option tells suck to log various debugging messages to "debug.suck", primarily
for use by the maintainer.
\-e | \-E filename
These options will send all error messages (normally displayed on stderr), to
an alternate file. The lower case version, -e, will send the error messages
to the compiled-in default defined in suck_config.h. The default is suck.errlog.
The upper case version, -E, requires the filename parameter. All error messages
will then be sent to this file.
\-f
This option tells suck to reconnect after deduping, and before downloading the articles. This is in case
long dedupe times cause timeouts on the remote end.
\-F
This option tells suck to reconnect after reading the local active file, and before downloading the Msg-IDs.
This is in case of a large active file, which causes timeouts on the remote end.
\-g
This option causes suck to only download the headers of any selected articles.
As a result, any batching of articles is skipped. This option does
work with killfiles; however, killfile options such as BODYSIZE> will be
ignored, since the body of the article is never downloaded.
\-G
This option causes suck to display the message count and BPS status lines in a slightly different format,
more suitable for use by a filter program (such as a GUI).
\-H
This option will cause suck to bypass the history check.
\-HC
Use a DBZ history file index (cnews or inn 1).
\-HF history_file_name
This option tells suck the location of the history file. The default
is /var/lib/news/history.
\-HI
Use inn 2.4 history file index.
\-hl localhost
This option specifies the localhost name. It is required with both the \-A and \-bp options.
\-HN
No history file. (Same as -H)
\-HO
Use history file with no index.
\-i number_to_read
This option tells suck the number of articles to download if you are using the -A
or -AL option, and a new group is added. The default is defined in suck_config.h
(ACTIVE_DEFAULT_LASTREAD, currently -100). NOTE: This must be a negative
number (eg -100, -50), or 0 to download all articles currently available in
the group.
\-k
This option tells suck to NOT attach the postfix from the \-p option to the names of the killfiles,
both the master killfile and any group files. This allows you to maintain one set of killfiles for
multiple servers.
\-K
This option will cause suck to bypass checking the killfile(s).
\-l phrase_file
This option tells suck to load in an alternate phrase file, instead of using
the built-in messages. This allows you to have suck print phrases in another
language, or to allow you to customize the messages without re-building suck.
See below.
\-lr
This option is used in conjunction with the highest article option in the sucknewsrc to
download the oldest articles, vice the newest articles. See that section for more details.
\-L
This option tells suck to NOT log killed articles to suck.killlog.
\-LF filename
This option allows you to override the built-in default of "suck.killlog" for the
file which contains the log entries for killed articles.
\-LL
This option tells suck to create long log entries for each killed article. The long
entry contains the short log entry and the header for the killed message.
\-LS
This option tells suck to create short log entries for each killed article. The short
entry contains which group and which pattern was matched, as well as the MsgID of the
killed article.
\-M
This option tells suck to send the "mode reader" command to the remote
server. If you get an invalid command message immediately
after the welcome announcement, then try this option.
\-n
This option tells suck to use the article number vice the MsgId to retrieve the articles. This
option is supposedly less harsh on the remote server. It can also eliminate problems if your
ISP ages off articles quickly and you frequently get "article not found" errors.
Also, if your ISP uses DNEWS, you might need this option so that it knows you're reading articles in a group.
\-N port_number
This option tells suck to use an alternate NNRP port number when connecting
to the host, instead of the default, 119.
\-O
This option tells suck to skip the first article upon restart. This is used whenever
there is a problem with an article on the remote server. For some reason, some
NNTP servers time out when they have a problem with a particular article.
Yet, when you restart, you're back on the same article, and you time out again.
Skipping the first article upon restart lets you
get the rest of the articles.
\-p extension
This extension is added to all files so that you can have multiple site feeds.
For example, if you specify -p .dummy, then suck looks for sucknewsrc.dummy, suckkillfile.dummy,
etc, and creates its temp files with the same extension. This will allow you to keep
multiple sucknewsrc files, one for each site.
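.PP
For example, a hypothetical two-server setup might be run as:
.RS
%suck news1.example.com \-p .news1
.RE
.RS
%suck news2.example.com \-p .news2
.RE
With this setup suck looks for sucknewsrc.news1 and sucknewsrc.news2, and creates
its temp files as suck.newrc.news1 and suck.newrc.news2, respectively.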
\-q
This option tells suck to not display the BPS and article count messages during download.
Handy when running suck unattended, such as from a crontab.
\-R
This option tells suck to skip a rescan of the remote news server upon a restart. The
default is to rescan the news server for any new articles whenever suck runs, including
restarts.
\-rc
This option tells suck to change its behavior when the remote server resets its article
counters. The default behavior is to reset the lastread in sucknewsrc to the current
high article counter. With this option, suck resets the lastread in sucknewsrc to the
current low article counter, causing it to suck all articles in the group and use
the historydb routines to dedupe existing articles.
\-s | \-S filename
These options will send all status messages (normally displayed on stdout), to
an alternate file. The lower case version, -s, will send the status messages
to the compiled-in default defined in suck_config.h. The default is /dev/null,
so no status messages will be displayed. The upper case version, -S, requires
the filename parameter. All status messages will then be sent to this file.
\-sg
This option tells suck to add the name of the current group being downloaded, if known,
to the BPS display. Typically the only time suck doesn't know the group name is if
an article is downloaded via the suckothermsgs file.
\-ssl
This option tells suck to use SSL to talk to the remote server, if suck was compiled with
SSL support.
\-SSL
This option tells suck to use SSL to talk to the local server, if suck was compiled with
SSL support.
\-T timeout
This option overrides the compiled-in TIMEOUT value. This is how long suck waits for data from the
remote host before timing out and aborting. The timeout value is in seconds.
\-u
This option tells suck to send the AUTHINFO USER command immediately upon connect to the
remote server, rather than wait for a request for authorization. You must supply the
\-U and \-P options when you use this option.
\-U userid
\-P password
These two options let you specify a userid and password, if your NNTP server
requires them.
\-Q
This option tells suck to get the userid and password for NNTP authentication from
the environment variables "NNTP_USER" and "NNTP_PASS" vice the -U and -P options.
This prevents a potential security problem where someone doing a ps command can
see your userid and password.
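.PP
For example, from a shell (the server name, userid and password are only illustrative):
.RS
%NNTP_USER=myid NNTP_PASS=mypass suck news.example.com \-Q
.RE
This way the userid and password are passed through the environment and never
appear on the command line shown by ps.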
\-V
This option will cause suck to print out the version number and then exit.
\-w pause_time pause_nr_msgs
This option allows you to slow down suck while pulling articles. If you
send suck a predefined signal (default SIGUSR1, see suck_config.h),
suck will swap the default pause options (if specified by the -W option),
with the values from this option. For example, if you run suck with -w 2 2
and send suck a SIGUSR1 (using kill), suck will then pause 2 seconds
after every 2 messages, allowing the server to "catch its breath."
If you send suck another SIGUSR1, then suck will put back the default
pause options. If no pause options were specified on the command line
(you omitted -W), then suck will return to the default full speed pull.
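.PP
For example, if suck is already running with \-w 2 2 on its command line, the signal
can be sent from another shell (pidof is assumed to be available; any way of finding
the process id will do):
.RS
%kill \-USR1 `pidof suck`
.RE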
\-W pause_time pause_nr_msgs
This option tells suck to pause between the download of articles. You need
to specify how long to pause (in seconds), and how often to pause (every X nr
of articles). Ex: \-W 10 100 would cause suck to pause for 10 seconds every
100 articles. Why would you want to do this? Suck can cause heavy loads on
a remote server, and this pause allows the server to "catch its breath."
\-x
This option tells suck to not check the Message-IDs for the ending > character. This option
is for brain dead NNTP servers that truncate the XHDR information at 72 characters.
\-X
This option tells suck to bypass the XOVER killfiles.
\-y post_filter
This option is only valid when using any of the batch modes. It allows you to edit any or all of
the articles downloaded before posting to the local host. See below for more details.
\-z
This option tells suck to bypass the normal deduping process. This is primarily for
slow machines where the deduping takes longer than the download of messages would. Not
recommended.
\-Z
This option tells suck to use the XOVER command vice the XHDR command to retrieve the
information needed to download articles. Use this if your remote news server doesn't
support the XHDR command.
.SH LONG OPTION EQUIVALENTS
.RS
\-a \-\-always_batch
.RE
.RS
\-bi \-\-batch-inn
.RE
.RS
\-br \-\-batch_rnews
.RE
.RS
\-bl \-\-batch_lmove
.RE
.RS
\-bf \-\-batch_innfeed
.RE
.RS
\-bp \-\-batch_post
.RE
.RS
\-c \-\-cleanup
.RE
.RS
\-dt \-\-dir_temp
.RE
.RS
\-dd \-\-dir_data
.RE
.RS
\-dm \-\-dir_msgs
.RE
.RS
\-e \-\-def_error_log
.RE
.RS
\-f \-\-reconnect_dedupe
.RE
.RS
\-g \-\-header_only
.RE
.RS
\-h \-\-host
.RE
.RS
\-hl \-\-localhost
.RE
.RS
\-k \-\-kill_no_postfix
.RE
.RS
\-l \-\-language_file
.RE
.RS
\-lr \-\-low_read
.RE
.RS
\-m \-\-multifile
.RE
.RS
\-n \-\-number_mode
.RE
.RS
\-p \-\-postfix
.RE
.RS
\-q \-\-quiet
.RE
.RS
\-r \-\-rnews_size
.RE
.RS
\-rc \-\-resetcounter
.RE
.RS
\-s \-\-def_status_log
.RE
.RS
\-sg \-\-show_group
.RE
.RS
\-ssl \-\-use_ssl
.RE
.RS
\-w \-\-wait_signal
.RE
.RS
\-x \-\-no_chk_msgid
.RE
.RS
\-y \-\-post_filter
.RE
.RS
\-z \-\-no_dedupe
.RE
.RS
\-A \-\-active
.RE
.RS
\-AL \-\-read_active
.RE
.RS
\-B \-\-pre-batch
.RE
.RS
\-C \-\-reconnect
.RE
.RS
\-D \-\-debug
.RE
.RS
\-E \-\-error_log
.RE
.RS
\-G \-\-use_gui
.RE
.RS
\-H \-\-no_history
.RE
.RS
\-HF \-\-history_file
.RE
.RS
\-K \-\-killfile
.RE
.RS
\-L \-\-kill_log_none
.RE
.RS
\-LS \-\-kill_log_short
.RE
.RS
\-LL \-\-kill_log_long
.RE
.RS
\-M \-\-mode_reader
.RE
.RS
\-N \-\-portnr
.RE
.RS
\-O \-\-skip_on_restart
.RE
.RS
\-P \-\-password
.RE
.RS
\-Q \-\-password_env
.RE
.RS
\-R \-\-no_rescan
.RE
.RS
\-S \-\-status_log
.RE
.RS
\-SSL \-\-local_use_ssl
.RE
.RS
\-T \-\-timeout
.RE
.RS
\-U \-\-userid
.RE
.RS
\-V \-\-version
.RE
.RS
\-W \-\-wait
.RE
.RS
\-X \-\-no_xover
.RE
.RS
\-Z \-\-use_xover
.RE
.SH DESCRIPTION
.SH MODE 1 \- stdout mode
.RS
%suck
.RE
.RS
%suck myhost.com
.RE
.PP
Suck grabs news from an NNTP server and sends the articles to
stdout. Suck accepts the name of an NNTP server as an argument; if you don't
give one, it uses the environment variable
NNTPSERVER. You can redirect the articles to a file or compress them
on the fly, like "suck server.domain | gzip \-9 > output.gz".
What you do with the articles is then up to you. Maybe
the output is already on your local machine because you
used a slip line, or maybe you still have to transfer the output to your
local machine.
.SH MODE 2 \- Multifile mode
.RS
%suck \-m
.RE
.RS
%suck myhost.com \-m
.RE
.PP
Suck grabs news from an NNTP server and stores each article in a
separate file. They are stored in the directory specified in suck_config.h or
by the \-dm command line option.
.SH MODE 3 \- Batch mode
.RS
%suck myhost.com \-b[irlf] batchfile
.RE
.RS
or %suck myhost.com \-bp -hl localhost
.RE
.RS
or %suck myhost.com \-bP NR -hl localhost
.RE
.PP
Suck will grab news articles from an NNTP server and store them
into files, one for each article (Multifile mode). The location of the files
is based on the defines in suck_config.h and the \-dm command line option.
Once suck is done downloading the articles, it will build a batch file
which can be processed by either innxmit or rnews, or it will call lmove
to put the files directly into the news/group/number format.
\-bi \- build batch file for innxmit. The articles are left intact,
and a batchfile is built with a one\-up listing of the full path of each article.
Then innxmit can be called:
.RS
%innxmit localhost batchfile
.RE
\-bl \- suck will call lmove to put the articles into
news/group/number format. You must provide the name of the
configuration file on the command line. The following arguments from suck
are passed to lmove:
.RS
The configuration file name (the batchfile name provided with this option)
.RE
.RS
The directory specified for articles (-dm or built-in default).
.RE
.RS
The errorlog to log errors to (-e or -E), if provided on the command line.
.RE
.RS
The phrases file (-l), if provided on the command line.
.RE
.RS
The Debug option, if provided on the command line.
.RE
\-br \- build batch file for rnews. The articles are
concatenated together, with the #!rnews size
article separator. This can then be fed to rnews:
.RS
%rnews \-S localhost batchfile
.RE
\-r filesize specify maximum batch file size for rnews. This option
allows you to specify the maximum size of a batch file to be fed to rnews.
When this limit is reached, a new batch file is created AFTER suck finishes
writing the current article to the old batch file. The second and
successive batch files get a one-up sequence number attached to the
file name specified with the -br option. Note that since suck has to finish
writing out the current article after reaching the limit, the
max file size is only approximate.
\-bf \- build a batch file for innfeed. This batchfile contains the
MsgID and full path of each article. The main difference between this
and the innxmit option is that the innfeed file is built as the articles
are downloaded, so that innfeed can be posting the articles even while
more articles are being downloaded.
\-bp \- This option tells suck to build a batch file, and post the articles
in that batchfile to the localhost (specified with the \-hl option). This option
uses the IHAVE command to post all downloaded articles to the local host.
The batch file is called suck.post, and is put in the temporary directory (-dt).
It is deleted upon completion, as are the successfully posted articles.
If the article is not wanted by the server (usually because it already exists on
the server, or it is too old), the article is also deleted. If other errors
occur, the article is NOT deleted.
With the following command line, you can download and post articles without
worrying if you are using INND or CNEWS.
.RS
%suck news.server.com -bp -hl localhost -A -c
.RE
\-bP NR \- This option works identically to \-bp above, except instead of
waiting until all articles are downloaded, it will post them to the local
server after every NR articles are downloaded.
.RS
%suck news.server.com -bP 100 -hl localhost -A -c
.RE
.SH SUCK ARGUMENT FILE
.PP
If you specify @filename on the command line, suck will read from filename and
parse it for any arguments that you wish to pass to suck. You specify the
same arguments in this file as you do on the command line. The arguments
can be on one line, or spread out among more than one line. You may also
use comments. Comments begin with '#' and go to the end of a line. All
command line arguments override arguments in the file.
.RS
# Sample Argument file
.RE
.RS
-bi batch # batch file option
.RE
.RS
-M # use mode reader option
.RE
.SH SUCKNEWSRC
.PP
Suck looks for a file
.I sucknewsrc
to see what articles you want and
which you already received. The format of sucknewsrc is very simple. It
consists of one line for each newsgroup. The line contains two or
three fields.
The first field is the name of the group.
The second field is the highest article number that was in the group
when that group was last downloaded.
The third field, which is optional, limits the number of articles which
can be downloaded at any given time. If there are more articles than this
number, only the newest are downloaded. If the third field is 0, then
no new messages are downloaded. If the command line option \-lr is specified,
suck will download the oldest articles instead of the newest.
The fields are separated by a space.
.RS
comp.os.linux.announce 1 [ 100 ]
.RE
.PP
When suck is finished, it creates the file suck.newrc which contains the
new sucknewsrc with the updated article numbers.
.PP
To add a new newsgroup, just stick it in sucknewsrc, with a
highest article number of \-1 (or any number less than 0).
Suck will then get the newest X number of messages for that newsgroup.
For example, a -100 would cause suck to download the newest 100
articles for that newsgroup.
.PP
To tell suck to skip a newsgroup, put a # as the first
character of a line.
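.PP
A small sucknewsrc might therefore look like this (the group names and numbers are
only examples):
.RS
.nf
comp.os.linux.announce 11234 100
alt.test -100
#rec.humor.funny 5523
.fi
.RE
The first group is downloaded normally, but never more than 100 articles at a time;
the second is a newly added group, so the newest 100 articles will be pulled; the
third is skipped.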
.SH SUCKKILLFILE and SUCKXOVER
There are two types of killfiles supported in suck. The first, via
the file suckkillfile, kills articles based on information in the
actual article header or body. The second, via the file suckxover,
kills articles based on the information retrieved via the NNTP command
XOVER. They are implemented in two fundamentally different ways. The
suckkillfile killing is done as the articles are downloaded, one at a
time. The XOVER killing is done while suck is getting the list of articles
to download, and before a single article is downloaded. You may use
either, neither, or both types of killfiles.
.SH SUCKKILLFILE and GROUP KEEP/KILLFILES
If
.I suckkillfile
exists, the headers of all articles will be scanned and the article downloaded or not,
based on the parameters in the files. If no logging option is specified (see the -L options
above), then the long logging option is used.
.PP
Comment lines are allowed in the killfiles. A comment line has a "#" in the first position.
Everything on a comment line is ignored.
.PP
Here's how the whole keep/delete package works. All articles are checked against the
master kill file (suckkillfile). If an article is not killed by the master kill file,
then its group line is parsed. If a group file exists for one of the groups then the
article is checked against that group file. If it matches a keep file, then it is
kept, otherwise it is flagged for deletion. If it matches a delete file, then it is
flagged for deletion, otherwise it is kept. This is done for every group on the group line.
.PP
NOTES: With the exception of the USE_EXTENDED_REGEX parameter, none of these parameters are
passed from the master killfile to the individual group files. Each killfile is separate
and independent. Also, each search is case-insensitive unless you specifically make it
case-sensitive by starting the
search string with the QUOTE character (see below). However, the parameter part of the
search expression (the LOWLINES=, HILINES= part) is case sensitive.
.SH PARAMETERS
.RS
LOWLINES=#######
.RE
.RS
HILINES=#######
.RE
.RS
NRGRPS=####
.RE
.RS
NRXREF=####
.RE
.RS
QUOTE=c
.RE
.RS
NON_REGEX=c
.RE
.RS
GROUP=keep groupname filename OR
GROUP=delete groupname filename
.RE
.RS
PROGRAM=pathname
.RE
.RS
PERL=pathname
.RE
.RS
TIEBREAKER_DELETE
.RE
.RS
GROUP_OVERRIDE_MASTER
.RE
.RS
USE_EXTENDED_REGEX
.RE
.RS
XOVER_LOG_LONG
.RE
.RS
HEADER:
.RE
.RS
Any Valid Header Line:
.RE
.RS
BODY:
.RE
.RS
BODYSIZE>
.RE
.RS
BODYSIZE<
.RE
.PP
All parameters are valid in both the master kill file and the group files, with the
exception of GROUP, PROGRAM, PERL, TIEBREAKER_DELETE, and GROUP_OVERRIDE_MASTER.
These are only valid in the master kill file.
.SH KILL/KEEP Files Parameters
.PP
.I HILINES=
Match any article longer than the number of lines specified.
.PP
.I LOWLINES=
Match any article shorter than the number of lines specified.
.PP
.I NRGRPS=
This line will match any article which has more groups than the number specified
on the Newsgroups: line.
Typically this is used in a killfile to prevent spammed articles.
(A spammed article is one that is posted to many, many groups, such
as those get-rich-quick schemes, etc.)
.PP
.I NRXREF=
This line will match any article that has more groups than the number specified
on the Xref: line. This is another spam stopper. WARNING: the Xref: line is not
as accurate as the Newsgroups: line, as it only contains groups known to the news
server. This option is most useful in an xover killfile, since Xoverviews don't
typically provide the Newsgroups: line, but do provide the Xref: line.
.PP
.I HEADER:
.I Any Valid Header Line:
Suck allows you to scan any single header line for a particular pattern/string, or
you may scan the entire article header. To scan an individual line, just specify
it, for example to scan the From line for boby@pixi.com, you would put
.RS
From:boby@pixi.com
.RE
Note that the header line must EXACTLY match what is contained in the article. To scan
the Followup-To: line, simply put "Followup-To:" as the parameter.
To search the same header line for multiple search items, each search
item must be on a separate line, eg:
.RS
From:boby@xxx
.RE
.RS
From:nerd@yyy
.RE
.RS
Subject:suck
.RE
.RS
Subject:help
.RE
The parameter HEADER: is a special case of the above. If you use the HEADER: parameter,
then the entire header is searched for the item. You are allowed multiple HEADER: lines
in each killfile.
.PP
When suck searches for the pattern, it only searches for what follows
the :, and spaces following the : are significant. With the above example "Subject:suck",
we will search the Subject header line for the string "suck". If the example had read "Subject: suck",
suck would have searched for the string " suck". Note the extra space.
.PP
If your system has regex() routines on it, then the items searched for can be POSIX
regular expressions, instead of just strings. Note that the QUOTE= option is still
applied, even to regular expressions.
.PP
.I BODY:
This parameter allows you to search the body of an article for text. Again,
if your system has regex(), you can use regular expressions, and the QUOTE= option is
also applied. You are allowed multiple BODY: lines in each killfile.
WARNING: Certain regex combinations, especially those with .* at the beginning
(eg BODY:.*jpg), in combination with large articles, can cause the regex code
to eat massive amounts of CPU, and suck will seem like it is doing nothing.
.PP
.I BODYSIZE>
This parameter will match an article if the size of its body (not including the
header) is greater than this parameter. The size is specified in bytes.
.PP
.I BODYSIZE<
This parameter will match an article if the size of its body is less than this parameter.
The size is specified in bytes.
.PP
.I QUOTE=
This item specifies the character that defines a quoted string. The default
for this is a ". If an item starts with the QUOTE character, then the item is
checked as-is (case significant). If an item does not start with the QUOTE character,
then the item is checked without regard to case.
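.PP
For example, assuming the default QUOTE character, the entry
.RS
Subject:"FREE
.RE
makes the search for FREE case significant, while
.RS
Subject:free
.RE
is checked without regard to case, so it also matches "Free" and "FREE".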
.PP
.I NON_REGEX=
This item specifies the character that defines a non-regex string. The default
for this is a %. If an item starts with the NON_REGEX character, then the item
is never checked for regular expressions. If the item doesn't start with the NON_REGEX
character, then suck tries to determine if it is a regular expression, and if it
is, uses regex() on it. This item is so that you can tell suck to treat strings
like "$$$$ MONEY $$$$" as non-regex items. IF YOU USE BOTH QUOTE and NON_REGEX
characters on a string, the NON_REGEX character MUST appear first.
.PP
.I GROUP=
This line allows you to specify either keep or delete parameters on a group
by group basis. There are three parts to this line. Each part of this line
must be separated by exactly one space. The first part is either
"keep" or "delete". If it is keep, then only articles in that group which match
the parameters in the group file are downloaded. If it is delete, articles in that
group which match the parameters are not downloaded. The second part, the group name,
is the full name of the group whose articles are checked against the group file. The group name
may contain an * as the last character, to match multiple groups, eg: "comp.os.linux.*"
would match comp.os.linux.announce, comp.os.linux.answers, etc. The third part
specifies the group file which contains the parameters to check the articles against.
Note that if you specified a postfix with the \-p option, then this postfix is attached
to the name of the file when suck looks for it, UNLESS you use the \-k option above.
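.PP
As a minimal sketch of how these pieces fit together (all file and group names here
are hypothetical), a master suckkillfile might contain:
.RS
.nf
NRGRPS=5
GROUP=delete alt.test killfile.alt.test
GROUP=keep comp.os.linux.* keepfile.linux
.fi
.RE
where killfile.alt.test and keepfile.linux are group files containing only the
parameters (for example Subject: or BODY: lines) to check articles in those
groups against.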
.PP
.I GROUP_OVERRIDE_MASTER
This allows you to override the default behavior of the master kill file. If this
option is in the master kill file, then even if an article is flagged for deletion
by the master kill file, it is checked against the group files. If a group file
says not to delete it, then the article is kept.
.PP
.I TIEBREAKER_DELETE
This option allows you to override the built-in tie-breaker default. The potential
exists for a message to be flagged by one group file as kept, and another group
file as killed. The built-in default is to then keep the message. The TIEBREAKER_DELETE
option will override that, and cause the article to be deleted.
.PP
.I USE_EXTENDED_REGEX
This option tells suck to use extended regular expressions vice standard regular expressions.
It may be used in the master killfile, in which case it applies to all killfiles, or in an
individual killfile, where it only applies to the parameters that follow it in the
killfile.
.PP
.I XOVER_LOG_LONG
This option tells suck to format the kill log entries generated from an Xover killfile so that
they look like an article header. The normal output is to just print the Xover line
from the server.
.PP
.I PROGRAM=
This line allows suck to call an external program to check each article.
You may specify any arguments in addition to the program name on this line.
If this line is in your suckkillfile, all other lines are ignored. Instead, the
headers are passed to the external program, and the external program determines
whether or not to download the article. Here's how it works. Suck will fork
your program, with stdin and stdout redirected. Suck will feed the headers
to your program thru stdin, and expect a reply back thru stdout. Here's the
data flow for each article:
.RS
1. suck will write an 8 byte long string, which represents the length of the
header record, on stdin of the external program. The length is in ascii,
is left-aligned, and ends in a newline (example: "1234 \\n").
.RE
.RS
2. suck will then write the header on stdin of the external program.
.RE
.RS
3. suck will wait for a 2 character response code on stdout. This response code is
either "0\\n" or "1\\n" (NOT BINARY ZERO OR ONE, ASCII ZERO OR ONE). If the return
code is zero, suck will download the article; if it is one, suck won't.
.RE
.RS
4. When there are no more articles, the length written down (for step 1) will be zero
(again in ascii "0 \\n"). Suck will then wait for the external program to
exit before continuing on. The external program can do any clean up it needs,
then exit. Note: suck will not continue processing until the external program exits.
.RE
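.PP
The following is a minimal sketch, in C, of an external program implementing this
data flow. The kill rule itself (rejecting any header containing the string
"MAKE MONEY") is purely illustrative, and the sketch assumes the trailing newline
is included within the 8 byte length field, as in the examples above.
.RS
.nf
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

int main(void)
{
    char lenbuf[9];
    char *header;
    long len;

    for (;;) {
        /* step 1: read the 8 byte ascii length field sent by suck */
        if (fread(lenbuf, 1, 8, stdin) != 8)
            break;
        lenbuf[8] = '\\0';
        len = atol(lenbuf);
        if (len <= 0)       /* step 4: zero length, no more articles */
            break;

        /* step 2: read the article header itself */
        header = malloc(len + 1);
        if (header == NULL || fread(header, 1, len, stdin) != (size_t) len)
            break;
        header[len] = '\\0';

        /* step 3: reply "0\\n" to download the article, "1\\n" to skip it */
        if (strstr(header, "MAKE MONEY") != NULL)
            fputs("1\\n", stdout);
        else
            fputs("0\\n", stdout);
        fflush(stdout);     /* suck waits for this reply before continuing */

        free(header);
    }
    return 0;
}
.fi
.RE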
.PP
.I PERL=
This line allows suck to call a perl subroutine to check each article. In order
to use this option, you must edit the Makefile, specifically the PERL* options.
If the PERL=
line is in your suckkillfile, all other lines are ignored. Instead, the header
is sent to your perl subroutine, and your subroutine determines if the article
is downloaded or not. The parameter on the PERL= line specifies the file name
of the perl routine, eg:
.RS
PERL=perl_kill.pl
.RE
.PP
See the sample/perl_kill.pl for a sample perl subroutine. There are a couple of
key points in this sample. The "package Embed::Persistant;" must be in the perl
file. This is so that any variable names you create will not conflict with variable
names in suck. In addition, the subroutine you define must be "perl_kill", unless
you change the PERL_PACKAGE_SUB define in suck_config.h. Also, your subroutine must
return exactly one value, an integer, either 0 or 1. If the subroutine returns
0, then the article is downloaded, otherwise, the article is not downloaded.
.PP
NOTES: The perl file is only compiled once, before any articles are downloaded.
This is to prevent lengthy delays between articles while the perl routine
is re-compiled. Also, you must use Perl 5.003 or newer. In addition, you
are advised to run 'perl -wc filter' BEFORE using your filter, in order
to check for syntax errors and avoid problems.
.SH SUCKXOVER
If the file
.I suckxover
exists, then suck uses the XOVER command to get information
on the articles and decide whether or not to download the article.
Xover files use the same syntax as suckkillfiles, but support only a subset
of the commands.
.PP
The following killfile commands are not supported in suckxover files:
.RS
NRGRPS=
.RE
.RS
HEADER:
.RE
.RS
BODY:
.RE
.RS
TIEBREAKER_DELETE
.RE
.PP
Only the following header lines will be checked:
.RS
Subject:
.RE
.RS
From:
.RE
.RS
Message-ID:
.RE
.RS
References:
.RE
.PP
The size commands (
.I BODYSIZE>, BODYSIZE<, HILINES, and LOWLINES
) specify the total size of the article (not just the body) in
bytes or lines, respectively.
.PP
All other parameters are allowed. However, if you use an invalid parameter,
it is silently ignored.
.SH SUCKXOVER and PROGRAM= or PERL= parameters
These parameters are supported in a suckxover file; however, they work slightly
differently than described above. The key difference is that prior to sending
each individual xoverview line to your program, suck will send you the
overview.fmt listing that it retrieves from the server. This overview.fmt
is a tab-separated line describing the fields in each xoverview line.
.PP
For the PROGRAM= parameter, suck will first send your program an 8 byte long
string, which is the length of the overview.fmt. This length is formatted
like the lengths above (see step 1 under PROGRAM=). Suck will then send the overview.fmt.
After that, the flow is as described above. See sample/killxover_child.c for
an example.
.PP
For the PERL= parameter, your program must have two subroutines. The first
is perl_overview, which will receive the overview.fmt and not return anything.
The second subroutine is perl_xover, which will receive the xoverview line
and return 0 or 1, as described under PERL= above. See sample/perl_xover.pl
for an example.
.SH SUCKOTHERMSGS
If
.I suckothermsgs
exists, it must contain lines formatted in one of three ways. The first way
is a line containing a Message-ID, with the <> included, eg:
.RS
<12345@somehost.com>
.RE
This will cause the article with that Message-ID to be retrieved.
.PP
The second way is to put a group name and article number on a line starting
with an !, eg:
.RS
!comp.os.linux.announce 1
.RE
This will cause that specific article to be downloaded.
.PP
You can also get a group of articles from a group by using the following syntax:
.RS
!comp.os.linux.announce 1-10
.RE
.PP
Whichever method you use, if the article specified exists, it will be downloaded,
in addition to any articles retrieved via the
.I sucknewsrc.
These ways can be used to get a specific article in other groups,
or to download an article that was killed. These articles
.B ARE NOT
processed through the kill articles routines.
.SH SUCKNODOWNLOAD
If
.I sucknodownload
exists, it must consist of lines containing a Message-ID, with the <> included, eg:
.RS
<12345@somehost.com>
.RE
This will cause the article with that Message-ID to NEVER be downloaded. The
Message-ID must begin in the first column of the line (no leading spaces). This
file overrides
.I suckothermsgs
so if an article is in both, it will not be downloaded.
.SH POST FILTER
If the
.BI "-y post_filter"
option is specified on the command line in conjunction with any of the batch modes,
then suck will call the post filter specified, after downloading the articles, and
before batching/posting the articles.
The filter is passed the directory where the articles are stored (the -dm option).
The filter program is responsible for parsing the contents of the directory. See
sample/post_filter.pl for a sample post filter. This option was designed to
allow you to add your own host name to the Path: header, but if you need to
do anything else to the messages, you can.
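.PP
For example, a run that downloads, filters and posts in one pass might look like
this (the server name and filter path are only illustrative):
.RS
%suck news.example.com \-bp \-hl localhost \-y /usr/local/bin/my_post_filter.pl \-c
.RE
The filter is run once, after the download finishes and before the articles are
posted, with the article directory as its argument.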
.SH FOREIGN LANGUAGE PHRASES
If the
.BI "-l phrases"
option is specified or the file /usr/local/lib/suck.phrases (defined in suck_config.h)
exists, then suck will load an alternate language phrase file, and use
it for all status & error messages, instead of the built-in defaults. The command line
overrides the built-in default, if both are present.
The phrase file contains all messages used by suck, rpost, testhost,
and lmove, each on a separate line and enclosed in quotes. To generate
a sample phrase file, run
.BI "make phrases"
from the command line. This will create "phrases.engl", which is a list
of the default phrases. Simply edit this file, changing the english
phrases to the language of your choosing, being sure to keep the phrases
within the quotes. These phrases may contain variables to print items
provided by the program, such as hostname. Variables are designated
by %vN% where N is a one-up sequence per phrase. These variables may
exist in any order on the phrase line, for example,
.RS
"Hello, %v1%, welcome to %v2%" or
.RE
.RS
"Welcome to %v2%, %v1%"
.RE
are both valid phrases. Phrases may contain \\n, \\r, or \\t to print a newline, carriage return,
or tab, respectively. Note that the first line of the phrase file is the current version
number. This is checked against the version of suck running, to be sure that the phrases
file is the correct version.
If you modify any of the source code, and add in new phrases, you will need to regenerate
phrases.h, so that everything works correctly. To recreate, just run
.BI "make phrases.h"
from the command line.
.SH SIGNAL HANDLING
Suck accepts two signals, defined in
.I suck_config.h.
The first signal (default SIGTERM) will cause Suck to finish downloading the
current article, batch up whatever articles were downloaded, and
exit, without an error.
The second signal (default SIGUSR1) will cause suck to use the pause values defined with
the -w option (see above).
.SH EXIT CODES
Suck will exit with the following return codes:
.RS
0 = success
.RE
.RS
1 = no articles available for download.
.RE
.RS
2 = suck got an unexpected answer to a command it issued to the remote server.
.RE
.RS
3 = the -V option was used.
.RE
.RS
4 = suck was unable to perform NNTP authorization with the remote server.
.RE
.RS
-1 = general error.
.RE
.SH HISTORY
.RS
Original Author - Tim Smith (unknown address)
.RE
.RS
Maintainers -
.RE
.RS
March 1995 - Sven Goldt (goldt@math.tu-berlin.de)
.RE
.RS
July 1995 - Robert A. Yetman (boby@pixi.com)
.RE
.de R$
Revision \\$3, \\$4
..
.SH "SEE ALSO"
testhost(1), rpost(1), lpost(1).