skyhawk

Members
  • Posts

    69
  • Joined

  • Last visited

About skyhawk

  • Birthday 11/14/1948

Extra Info

  • Your CPU
    Pentium III 933 MHz
  • Your Graphics Card
    Nvidia TNT2 Pro
  • Your RAM
    256 MB
  • Your Hard Drive
    Maxtor 40 GB
  • Your Sound Card
    Intel onboard audio
  • Your Operating System
    Mandriva Free 2007.0 kernel 2.6.17-5mdv
  • Your Monitor
    HP L1720 LCD
  • Your Keyboard
    Compaq
  • Your Mouse
    Logitech PS/2 two-button
  • Your Case
    Compaq Deskpro EN


Profile Information

  • Location
    Kankakee, IL USA


  1. I was provided with the Free-2010.0-Dual ISO on disc for evaluation. The ISO creates an install disc, not a Live CD. Auto-detection installed the 32-bit version on my Compaq Deskpro EN (Pentium III 933 MHz, 256 MB RAM) with no problems. The apps provided by the install disc are very bare-bones, and no special considerations are given to dial-up users. GUI apps are very few in number: no GUI text editor, no GUI e-mail client, no GUI multimedia sound app, no GUI graphics app. The overall performance is amazingly fast, however, and it might be a good choice for someone with broadband access via a network card. I was forced to stay with PCLinuxOS 2009.4 LXDE, which has all the essential extras on the Live CD, including wvdial for dial-up. I am very pleased with it thus far, but somewhat disappointed that Mandriva does not yet have an equivalent offering.
  2. The only time I have seen what you describe happen is when I edited an image and saved it in another format. Is this true in your case? If so, save in the same format, or in another format that reduces the file size.
  3. Very interesting information. My soon-to-arrive PCLinuxOS discs will be treasured collector's items. Will the personalities moving to the newly named Linux distro be changed into "new" personalities to match the new distro? I very much doubt it. In my opinion there are too many Linux distros already and illustrious talent is spread too thinly. I would like to see those developers join an already established distro that is compatible with their mindset.
  4. I did a considerable amount of additional Googling and discovered that I have all the necessary tools to convert recorded ASF streams to WAV or other formats, particularly after adding "mencoder" to my installed packages. Finding the appropriate stream to record was also a major factor contributing to success; not all ASF streams seem to be created equal. As an example, use the following command-line in Konsole to capture the "SKY.fm - Mostly Classical" ASF audio stream:

     mplayer -cache 128 -dumpstream -msglevel all=-1 -nojoystick -nolirc mms://wstream5d.di.fm/classical_low/

     This is a 20 kbps (low bit-rate) stream that works fine with a dial-up modem. The stream will be saved with the filename "stream.dump"; use the "-dumpfile" option to save with a filename of your choice. See "man mplayer" for more details. Note that win32-codecs must be installed to handle ASF streams. Also note that audio will not be heard as it is recorded, so it is helpful to choose streams originating from sites that post playlists. On my system, running Mandriva 2007.0, pressing "q" does not stop "mplayer" while it is recording; I must open another desktop, open "Process Table", and kill "mplayer". Pressing "q" does stop "mplayer" when listening only.

     The following command-lines can then be used to convert the ASF audio file to WAV, then to MP3 format (if "lame" is installed to handle MP3 conversion):

     mplayer -msglevel all=-1 -nojoystick -nolirc input.asf -vc null -vo null -ao pcm:waveheader:file=output.wav
     lame -h -m s output.wav output.mp3

     The above procedure works fine for me, and it can be streamlined considerably by those who enjoy writing bash scripts. I do not do much stream recording, so I will leave the script writing to someone else.

     The following ASF stream (32 kbps) can be captured, but I am unable to play the file or convert it to another format:

     mplayer -cache 128 -dumpstream -msglevel all=-1 -nojoystick -nolirc mms://rx-wes-sea154.rbn.com/farm/pull/tx-rbn-sea003:1459/wmtencoder/wcpe/wcpeint/wmlive/wcpewin.asf

     "MPlayer" gives the error message "encrypted VOB file", so do not be surprised if you encounter cases such as this one. Remove the "-msglevel all=-1" option to see all warning and error messages. I also want to mention that "vsound" can be used to listen to and record RM audio streams simultaneously, while "Audacious" can be used similarly for OGG and MP3 audio streams. Be advised that I use the OSS audio driver (i810_audio) only. Your results might vary.
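The capture-and-convert steps above can be sketched as a small shell function, for anyone who does want to script it. This is only a sketch under the post's assumptions (mplayer with win32-codecs and lame installed, a working MMS stream URL); the function name and the "classical" base filename are my own inventions:

```shell
# capture_stream URL NAME -- dump an MMS/ASF stream to NAME.asf,
# decode it to NAME.wav, then encode NAME.mp3.
# Assumes mplayer (with win32-codecs) and lame are installed.
capture_stream() {
    url=$1
    name=$2
    # 1. Dump the raw stream to disk (kill mplayer to stop recording).
    mplayer -cache 128 -dumpstream -dumpfile "$name.asf" \
            -msglevel all=-1 -nojoystick -nolirc "$url"
    # 2. Decode the dumped ASF file to a WAV file.
    mplayer -msglevel all=-1 -nojoystick -nolirc "$name.asf" \
            -vc null -vo null -ao pcm:waveheader:file="$name.wav"
    # 3. Encode the WAV file to stereo MP3.
    lame -h -m s "$name.wav" "$name.mp3"
}

# Example, using the stream from the post:
# capture_stream "mms://wstream5d.di.fm/classical_low/" classical
```

The example call is commented out because the dump runs until mplayer is killed, as noted above.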
  5. Still using 2007.0, which is my first Linux distro, and my one-and-only OS. I will be keeping it on this hard drive unit as a back-up. Current plans are to buy a used Pentium 4 system unit and install the current version when KDE issues have been solved. In the meantime, I have ordered copies of PCLinuxOS 2009.1 KDE and Gnome to keep on-hand if KDE development remains troubled. I would prefer to update only when packages are no longer available in repositories, not annually or semi-annually as some do.
  6. Thanks for your reply, scarecrow. Valuable information, as always. I will get the needed packages for VLC and mencoder.
  7. Is a plugin available for "Audacious" that will enable it to handle ASF/ASX streaming audio? The "Audacious" website seems to say "no", but Googling seems to say "maybe", as a non-free plugin. If a packaged plugin is available, where can I find it? I like using "Audacious" because it permits listening to and saving OGG streams simultaneously, and MP3 streams as well; I would like to do the same with ASF/ASX streams. MPlayer handles ASF/ASX streaming audio, but does not permit listening and saving (recording) simultaneously. Using the following command-line in Konsole (as an example):

     mplayer -cache 256 -dumpaudio -dumpfile stream.asf -msglevel all=-1 -nojoystick -nolirc mms://rx-wes-sea154.rbn.com/farm/pull/tx-rbn-sea003:1459/wmtencoder/wcpe/wcpeint/wmlive/wcpewin.asf

     I can save (record) an ASF stream. Once I have done that, is there a way to convert the ASF file to WAV format, so that I can make an audio disc or use other software for further conversion? I am using Mandriva 2007.0, kernel 2.6.17-5mdv.
  8. I am considering "Krecord" as a means to transfer audio from audio cassettes to CD (audio cassette to WAV, then WAV to CD). No package is available for Mandriva 2007.0, so I will need to compile from source (krecord-1.16.tar.gz). In Krecord's README file, I see the following:

     "If you run in trouble make sure you have set the QTDIR and KDEDIR environment variables, like this:
     $ export QTDIR=/usr/lib/qt3
     $ export KDEDIR=/opt/kde3
     Of course you have to adopt the values to match the installation paths on your system ..."

     When I run "env" from the CLI, I see that my QTDIR is set as needed, but I need to set KDEDIR to "/usr/lib/kde3", and I want to set that environment variable permanently, making it unnecessary to add it every time at boot-up. What is the best way to do that? I am the only user on my computer. Also, if anyone has any experience using Krecord with on-board audio (i810_audio: Intel Corp.|ICH2 810 Chipset AC'97 Audio Controller [MULTIMEDIA_AUDIO] (vendor:8086 device:2445 subv:0e11 subd:000b)) and using "line-in" as input, please share your results. I do not use ALSA, only OSS. [moved from Software by spinynorman]
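For a single user with a bash login shell, one common way to make the variable permanent is to append the export line to ~/.bash_profile (other shells use different startup files); a minimal sketch:

```shell
# Append the export to ~/.bash_profile so every login shell sets it.
echo 'export KDEDIR=/usr/lib/kde3' >> "$HOME/.bash_profile"

# Apply it to the current shell without logging out:
. "$HOME/.bash_profile"
echo "$KDEDIR"   # prints /usr/lib/kde3
```

After this, "env" should show KDEDIR in every new login session, with no need to set it again at boot-up.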
  9. I have a simple question about the meaning of some run-command options that I see when looking at menu items in the KDE Menu Editor. The options listed below were added automatically when the packages were installed:

     amarok %U
     kmix -caption "%c" %i %m
     easytag %F

     "%U" is the most common option added automatically. I have Googled for answers without success, so if anyone can enlighten me, please do. I am running Mandriva 2007.0, KDE 3.5.3.
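For what it's worth, these placeholders are standard "field codes" from the freedesktop.org Desktop Entry specification, which KDE expands when it launches the entry: %U becomes the URL(s) being opened, %F the local file path(s), %c the translated caption (the entry's Name), %i the icon option derived from the Icon key, and %m a deprecated mini-icon code. A hypothetical .desktop fragment for illustration:

```
[Desktop Entry]
Name=Amarok
Type=Application
Icon=amarok
# %U is replaced at launch time with the URLs the user opens with this entry.
Exec=amarok %U
```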
  10. This follow-up to my initial post on this topic should be of interest to those who use dial-up for Internet access and who want a means to save multi-segment downloads session-to-session. cURL is the application you will want to use. Assuming you have cURL installed, open Konsole, cd to the directory where you want the download saved, and, as an example, type the following command-line:

      curl -m 3600 -o FILE -v -# URL

      The remote file (URL) will be saved in the present working directory as a local file named FILE (use any local filename you wish, as long as it is not the same as the remote filename). cURL will automatically time out at 3600 seconds (one hour) in this example; omit or modify the "-m" option as you wish. The "-v" option tells cURL to run in verbose mode (recommended), and the "-#" option uses the # character for the progress meter. View the page source of each download page to find the desired download link, then copy-and-paste it into the command-line above. Note that if your chosen URL reads something like:

      http://www.FreeForTheTaking.com/Really Big File.zip

      where there are spaces in the remote filename, you will need to substitute %20 for every space in the remote filename. Thus, the URL in the command-line will read:

      http://www.FreeForTheTaking.com/Really%20Big%20File.zip

      Let's assume that you used the "-m 3600" option, cURL automatically timed out at 3600 seconds, and you downloaded only 15 megabytes of the 28-megabyte remote file. For the next session, do as above, but use the command-line:

      curl -C - -m 3600 -o FILE -v -# URL

      This will cause cURL to continue downloading the local FILE at the correct offset of the remote file. Do not forget the " -" after "-C". In this example, two sessions were sufficient to download the complete remote file; for even larger remote files, repeat the procedure as needed. I am forced to restrict my download sessions to 3600 seconds, due to limitations imposed by my ISP.

      I choose a local filename that is different from the remote filename because, if this is not done, I found that in certain instances where the path on the server to the remote file changes every session (a security precaution), each download segment is saved locally as a separate file, not appended to the previous one as it should be. Read the man page for cURL for information on all available options. cURL is a great tool for saving multi-segment downloads; it works flawlessly if used properly.
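The two-session procedure above can be wrapped in a small function that picks the right command-line automatically. A sketch only; the function name is mine, and the URL and filename in the example are placeholders:

```shell
# fetch_resumable FILE URL -- start a download, or resume it at the
# correct offset if a partial local copy already exists.
fetch_resumable() {
    file=$1
    url=$2
    if [ -f "$file" ]; then
        # "-C -" makes curl inspect the partial file and continue there.
        curl -C - -m 3600 -o "$file" -v -# "$url"
    else
        # Fresh download, limited to 3600 seconds per session.
        curl -m 3600 -o "$file" -v -# "$url"
    fi
}

# Example (placeholder URL; remember to encode spaces as %20):
# fetch_resumable bigfile.zip "http://www.FreeForTheTaking.com/Really%20Big%20File.zip"
```

Run the same command each session; the if-test decides whether to start fresh or resume.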
  11. If I am not mistaken, kget is a GUI front-end for Wget. Correct me if I'm wrong. I do have kget installed, by the way. I found the home page for Retriever Download Manager and it looks like it offers all the features I want. It requires JRE 1.5. Retriever is now in version 1.3.
  12. I have dial-up Internet access and I want to find a way to download games or other applications (usually archived in zip format) in segments that can be saved to disk one day, then resumed another day, and so on until the complete archive has been downloaded. Segments will generally not exceed 15 MB in size, corresponding to about one hour of download time per session. I see from the Wget man page that Wget allows such behavior using the "-c" (continue) command-line parameter. However, the downloads I am primarily seeking are protected by layers of security to deter "link stealers." Thus, even if a URL for a download can be resolved and seen on-screen, it cannot be used after the session has ended; the URL changes each time a new session is started. I also see from the cURL man page that it accepts the "--referer" command-line parameter, which might make it a more likely candidate for segmented downloads than Wget. I would guess that I could use my web browser to work through the layers of security, resolve a download's URL, then enter the URL into the command-line for starting Wget, but this creates another problem. Some sites will not allow simultaneous downloading of multiple files, nor will they allow download managers that use multiple connections to accelerate downloads. Using a web browser and Wget to simultaneously download the same archive, even for a fraction of a second, would probably get me banned from some websites. So, does anyone know a good way, if any, to do what I want to do? My primary objective is to be able to resume a partial file download that has been saved to hard disk during a previous session. My secondary concern is the best way to get around the multiple-file-download ban by not accidentally triggering it.

      The following is taken from the HOTU FAQs page, for those who want to experiment:

      "If you've got the referer setting in your download manager set up correctly, but you can't convince it to accept the link, make a bookmark with this link:

      javascript:document.writeln(document.forms[document.forms.length - 1].action + '?firewall=' + (document.getElementsByName('firewall')[0].checked ? 1 : 0) + '&code=' + document.getElementsByName('code')[0].value);

      Then, when you're ready to download, fill in the form as usual, but instead of clicking the "Go" button, select this bookmark. Then copy-and-paste the URL it gives you into your download manager. I repeat: this will not work unless your referer setting is correctly set up and you fill out the form on the download page completely."

      I have not yet been able to get a URL to copy-and-paste, as per the above instructions. My one-and-only operating system is Mandriva 2007.0, kernel 2.6.17-5mdv. I primarily use Firefox 1.5.0.7, which came with my Mandriva installation discs, but I also have Firefox 3.0 and Opera 9.50 installed. A download manager that allows session-to-session segmented downloads, web browser integration, and referer entry would be ideal.
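If a site's protection is purely referer-based, cURL's "--referer" parameter mentioned above can be combined with a resumable command-line. A sketch with placeholder URLs and a made-up function name; whether this satisfies any particular site's security layers is not guaranteed:

```shell
# fetch_with_referer REFERER URL FILE -- download URL while presenting
# REFERER as the HTTP Referer header, resuming if FILE already exists.
fetch_with_referer() {
    referer=$1
    url=$2
    file=$3
    curl --referer "$referer" -C - -m 3600 -o "$file" -v -# "$url"
}

# Example (placeholder URLs):
# fetch_with_referer "http://www.example.com/download.html" \
#                    "http://www.example.com/files/game.zip" game.zip
```

Sites that rotate the download URL every session would still require resolving a fresh URL in the browser first, as described in the post.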
  13. I finally made the effort to install Zdoom 2.2.0 and it was time well spent. My first attempt to compile the source code was halted by two missing development libraries, but after those were installed my second attempt was successful. No loading errors were encountered when Zdoom was first launched. A few minutes were spent with Doom II, then I loaded Eternal Doom IV, which is still a work in progress. Zdoom is required to run Eternal Doom IV. Eternal Doom IV is nothing less than spectacular, even considering that many levels have yet to be added. It is a great leap forward from the preceding final release of Eternal Doom. The graphics are beautiful and very realistic, running fluidly on my Compaq Deskpro EN, Pentium III 933 MHz with nVidia Riva TNT2 AGP card, Mandriva 2007.0, kernel 2.6.17-5mdv. Sound is excellent; far superior to DOSBox 0.72. The game engine accepts OSS with no complaints. The first level (hub) of Eternal Doom IV begins with Map 8; warp there to start the game. Those who want to install Zdoom 2.2.0 and follow Eternal Doom IV as it progresses should print a copy of "Compile Zdoom on Linux", available on the Zdoom Wiki pages, and read it thoroughly. The dependencies listed therein should be interpreted to mean "development libraries required, also", although it is not explicitly stated as such. FMOD 3.75 was one of the requirements for my install, but only five files contained in the tarball are actually needed by Zdoom 2.2.0. Team TNT has done meticulous, ground-breaking work with the initial levels. I am eagerly awaiting the levels to come. As I wait, I will find a safe niche somewhere in an immense courtyard, listen to the crickets chirp in the darkness of a summer evening, and try to dodge fireballs hurled my way.
  14. As an update, let me mention that PeaZip 2.1 recently was released. Unlike previous releases that were plagued with 'garbled' text in the last column when an archive was opened, PeaZip 2.1 is free of this little glitch. I have both the GTK1 and GTK2 standalone versions installed under /usr/local and both are functioning very nicely, although the GTK2 version opens much faster (3 seconds) than the GTK1 version (12 seconds) on my Pentium III 933 MHz machine.
  15. To save time and effort, buy the release version of your choice on CDs or DVD. I received very satisfactory service from LinuxCDs.org, and there are other companies that can provide install discs; the cost is minimal. Once you have installed the basic OS, you can download additional packages from the repositories. I save all my downloaded packages on CD, just in case I might need them in the future. If you download and install packages using urpmi, they will be installed, but not saved, under normal circumstances. You will need to use wget, or something similar, to save additional packages to hard disk.
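As a sketch of the wget approach (the mirror URL below is a placeholder; substitute the path to the package on a real Mandriva mirror):

```shell
# save_package URL -- fetch a package file into the current directory,
# resuming ("-c") if a partial copy is already present, so the .rpm
# can be burned to CD for later reuse.
save_package() {
    wget -c "$1"
}

# Example (placeholder mirror URL):
# save_package "http://mirror.example.com/mandriva/2007.0/media/main/somepackage.rpm"
```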