kilimanjaro Posted October 19, 2003 I cannot update anything now. I get <unable to access hdlist file of "main", medium ignored>, and I can't get it to un-ignore the medium either. I have tried both the GUI and a terminal (urpmi.addmedia). It has also ignored CD 1 and contribs, so I can't install anything because I can't get the dependencies.
WilliamS Posted October 19, 2003 Have you tried running urpmi.update -a as root in Konsole (while online)?
kilimanjaro Posted October 20, 2003 Actually, I got contribs working, and I did run urpmi.update -a. The problem now is that I am getting the same error from my main site.
kmack Posted October 20, 2003 Most of the time this is caused by a busy mirror timing you out. Sadly, there is no error message that tells you this. Let some time go by and try again. You can also add the --wget switch to force urpmi to use wget instead of curl for downloading:

# urpmi.update --wget -a

If that doesn't work, you might need to find another mirror to use. Right now, with all the 9.2 fever, the mirrors are busy, but it should get better until the public release of the ISOs, when it will start all over again!
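For what it's worth, switching mirrors can be sketched like this. This is only a sketch: urpmi is Mandrake-specific, so the commands are shown commented out, and the MIRROR URL and repository paths are placeholders, not a real mirror.

```shell
# Placeholder mirror URL -- substitute one from the Mandrake mirror list.
MIRROR="ftp://ftp.example.org/pub/Mandrake"

# Refresh all configured media, forcing wget instead of curl:
#   urpmi.update --wget -a

# If one mirror keeps timing out, drop the medium and re-add it from
# another mirror (the path and hdlist location below are illustrative;
# check your chosen mirror's actual layout):
#   urpmi.removemedia main
#   urpmi.addmedia main "$MIRROR/9.2/i586/Mandrake/RPMS" with ../base/hdlist1.cz
```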
kilimanjaro Posted October 20, 2003 How does wget differ from curl?
kmack Posted October 20, 2003 Good question... they are basically the same AFAIK, but wget seems a bit more robust and sometimes gets the job done better. Most of the "old pro" crowd seems to recommend it, and I have learned to listen to the voice of experience. wget has worked for me a couple of times when I had problems getting things downloaded.

Another tip is to avoid peak usage hours on the internet; in other words, find times when neither your area nor the mirror you use is at its busiest. In my area, 0900-1200, 1400-1500, 1700-1800, and 2200-2230 seem to be the peak usage times. Your mileage may vary.

curl:
Summary: Gets a file from an FTP, GOPHER, or HTTP server.
Description: curl is a client for getting documents/files from servers, using any of the supported protocols. The command is designed to work without user interaction or any kind of interactivity.

wget:
Summary: A utility for retrieving files using the HTTP or FTP protocols.
Description: GNU Wget is a file-retrieval utility that can use either the HTTP or FTP protocol. Its features include the ability to work in the background while you're logged out, recursive retrieval of directories, file-name wildcard matching, remote file timestamp storage and comparison, use of REST with FTP servers and Range with HTTP servers to retrieve files over slow or unstable connections, support for proxy servers, and configurability.

There are lots of ways to get this kind of info. I often use the REMOVE SOFTWARE option to see where files are and which files are part of a package; take a look and turn on the Max Info button. You can do it from the CLI with rpm, too. Better yet, try man wget and man curl to learn the syntax of the commands. I like to do this in Konqueror by typing man:command_name (e.g. man:wget) in the Location bar; it is easier to read in Konqueror.
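To make the comparison above concrete, here is a minimal side-by-side of the two downloaders fetching the same file. The URL is a placeholder, not a real mirror, so the actual download commands are left commented out.

```shell
# Placeholder URL for illustration only.
URL="http://example.com/somefile.rpm"

# curl writes the response body to stdout unless told otherwise;
# -O saves it under the remote file name instead:
#   curl -O "$URL"

# wget saves under the remote name by default, retries on failure,
# and -c resumes a partial download -- handy on flaky mirrors:
#   wget -c "$URL"
```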
kilimanjaro Posted October 20, 2003 Thank you, kmack.