Message boards : Server and website : upload problems
Sorry, I'm outta here. It's taking WAY too much time to babysit uploads of completed WUs, especially since the retry interval often exceeds the run time of the next WU, so I get idle GPUs.
ID: 53777
I've been running into slow or stalled uploads over the last day.
ID: 53839
Still having upload issues. It was great this morning when I was able to clear the whole backlog of stalled uploads.
ID: 53912
Seeing:
ID: 53978
Yep.
ID: 53979
Same here.
ID: 53980
Yeah, seems like the upload server ran out of space.
ID: 53982
Yea seems like upload server ran out of space. This is exactly what someone here in the forum was warning about when those 300,000 tasks were put in place for download and crunching. Obviously, the warning was not taken seriously enough :-(
ID: 53984
The infinite-capacity disks were out of stock.
ID: 53985
The infinite-capacity disks were out of stock. Perhaps a Toni-replication tool should be investigated also...
ID: 53987
Bloody Sunday, it's full again!
ID: 53993
The infinite-capacity disks were out of stock. Dammit! Must be with the toilet paper delivery...
ID: 53995
If funds were required: the donation form is still unavailable.
ID: 53997
If funds were required: the donation form is still unavailable. Ditto here. I also can't seem to create a profile: the bot-screening picture doesn't load with the page, so it rejects saving the profile. Maybe some HTML code errors?
ID: 54004
The upload server is out of disk space again.
ID: 54155
Same problem for everyone; thanks in advance for the intervention, Toni.
ID: 54156
What I don't understand is that there is no estimate of when the disk will be full, even though this should be rather easy to produce by simply watching the hourly/daily upload volume.
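For what it's worth, such an estimate needs nothing more than two disk-usage samples taken some hours apart. A minimal sketch in Python; every size and rate below is invented for illustration, not a measured GPUGRID figure:

```python
# Estimate when a partition will be full by linear extrapolation of two
# usage samples. All numbers here are illustrative, not GPUGRID's real values.

def hours_until_full(used_gb_then, used_gb_now, hours_between, capacity_gb):
    """Extrapolate disk growth linearly; return None if usage is not growing."""
    growth_per_hour = (used_gb_now - used_gb_then) / hours_between
    if growth_per_hour <= 0:
        return None  # usage flat or shrinking: no fill predicted
    return (capacity_gb - used_gb_now) / growth_per_hour

# Example: 1.2 TB used six hours ago, 1.5 TB used now, on a 2 TB buffer disk.
eta = hours_until_full(1200, 1500, 6.0, 2000)
print(f"Disk full in about {eta:.1f} hours")  # 500 GB left at 50 GB/h -> 10 hours
```

Run hourly from cron against the upload partition, this would be enough to post a rough ETA on the server status page.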
ID: 54157
For information:
ID: 54158
Suspending to try and fix the full disk.
ID: 54159
Happy to hear from you, Toni.
ID: 54160
This also coincides with an outage I've noticed today at Rosetta@home. Rosetta is based in the U.S. (Seattle, Washington). They just have a shortage of work at the moment; their user count has roughly quadrupled since the virus started.
ID: 54162
Suspending to try and fix the full disk. Thanks for explaining. After uploads worked again for a few hours this late morning, they have stopped once more a short while ago :-(
ID: 54163
Oh, I've been suffering from it too. An upload starts and then turns to "Try again later" within a few seconds. Please fix it; the sooner the better.
ID: 54164
The server's disk is just a buffer; it is emptied continuously. Disk-full conditions happen when there is even a temporary imbalance between the in (uploads) and out (moving to the main servers) rates. A few hours of imbalance are sufficient to fill it. At such high volumes there is no "easy fix". Now I understand the 2-WU-per-GPU limitation: you're trying to balance the goes-inners and the goes-outters.
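The arithmetic behind that imbalance is easy to make concrete; the rates and sizes below are made-up illustrative figures, not measured GPUGRID values:

```python
# Time for a buffer disk to fill under an in/out rate imbalance.
# All rates and sizes are invented for illustration.

def fill_time_hours(free_gb, inflow_gb_per_h, outflow_gb_per_h):
    """Hours until `free_gb` of headroom is gone; None if draining keeps up."""
    net = inflow_gb_per_h - outflow_gb_per_h
    if net <= 0:
        return None  # out-rate matches or beats in-rate: the buffer never fills
    return free_gb / net

# 400 GB free, uploads arriving at 120 GB/h, results moved off at 100 GB/h:
print(fill_time_hours(400, 120, 100))  # 20 GB/h net inflow -> 20.0 hours
```

Even a modest 20% mismatch between the two rates eats a few hundred gigabytes of headroom in under a day, which matches Toni's "a few hours of imbalance are sufficient".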
ID: 54177
At such high volumes there is no "easy fix". A larger disk buffer?
ID: 54182
Having the same issue: 6 GPU tasks refusing to upload.
ID: 54205
This is still a big problem. I have to retry submitting completed WUs several times a day to get idle GPUs working again. Please fix the server issue.
ID: 55288
If you can't fix your server problems, you could at least let us download a day's worth of WUs. That would be a minimum of 12 WUs per GPU.
ID: 55289
This is still a big problem. I haven't seen it, though I am running only two GTX 1060s at the moment. Since your computers are hidden, there is not much more to say.
ID: 55291
Every morning I wake up to most of my GPUs sitting idle, waiting for GPUGRID to UL and subsequently DL WUs. I know of no other project that is anywhere near as inefficient at keeping work supplied.
ID: 55295
I'm not having any issue with uploads or downloads.
ID: 55297
I also have frequent upload problems. Not always, and not across all my systems; I haven't found a common pattern, but it has really been happening for many months. And access to the project website is very slow most of the time.
ID: 55298
After about 3 hours all GPUs have gone idle because WUs do not UL and so the paltry 2 WUs do not DL.
ID: 55299
After about 3 hours all GPUs have gone idle because WUs do not UL and so the paltry 2 WUs do not DL. There is a known issue on GPUGrid where, after you access the server, there is some dead time before you can access it again. I am not seeing it at the moment, but it bites everyone eventually. That may be what you're hitting, depending on how many machines you have. I would guess it is some sort of anti-DDoS protection feature on the campus network, but no one knows (or admits to) what the problem is.
ID: 55301
There is a known issue on GPUGrid that after you access the server, there is some dead time before you can access it again... This problem was discussed at the end of 2019 in the thread "Unable to load units". Currently I have 7 hosts in production, all attached to the same local network. I've found that (for me), when I want to replenish WU buffers, the most effective way is to manually ask for WUs one host at a time, from lowest to highest local IP. I've configured a fixed IP address for each one. These 7 hosts manage 11 GPUs in total (some of them are multi-GPU systems), so the maximum number of WUs I can hope to download simultaneously from GPUGrid is 22 (two per GPU)... But most of the time the whole group manages itself quite well by means of the individual BOINC Managers, without human intervention.
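That one-host-at-a-time routine can be scripted against BOINC's standard `boinccmd` remote-control tool. A rough sketch, assuming remote RPC access is enabled on each host; the host addresses and password are placeholders, not real values:

```python
# Ask each host, in IP order, to contact GPUGRID for work via boinccmd.
# Assumptions: boinccmd is in PATH, and each host allows remote GUI RPC
# (remote_hosts.cfg / gui_rpc_auth.cfg). Addresses and password are placeholders.
import subprocess

HOSTS = ["192.168.1.10", "192.168.1.11", "192.168.1.12"]  # lowest local IP first
PROJECT_URL = "https://www.gpugrid.net/"
RPC_PASSWORD = "changeme"  # placeholder GUI RPC password

def update_command(host):
    """Build the boinccmd invocation that triggers a scheduler contact."""
    return ["boinccmd", "--host", host, "--passwd", RPC_PASSWORD,
            "--project", PROJECT_URL, "update"]

def run_updates(hosts):
    """Poke each host in turn; failures on one host don't stop the rest."""
    for host in hosts:
        subprocess.run(update_command(host), check=False)
```

Calling `run_updates(HOSTS)` from a small cron job would reproduce the manual lowest-to-highest-IP routine described above.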
ID: 55303
No finished task can be uploaded.
ID: 56057
Same here: 19.12.2020 11:45:10 | GPUGRID | Started upload of 3m0eA01_379_2-TONI_MDADex2sm-43-50-RND8091_0_10
ID: 56058
Today is Saturday... I hope they solve the problem next week...
ID: 56062
Today is Saturday... I hope they solve the problem next week... This is not the first time this has happened, so I am surprised that they haven't installed a warning system whereby someone gets a notification as soon as the disk is about 80% full or so.
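A bare-bones version of such a watchdog is only a few lines. A sketch in Python; the mount point, threshold, and notification hook are all placeholders:

```python
# Warn when a partition passes a usage threshold (run from cron, e.g. hourly).
# The default path and threshold are illustrative placeholders.
import shutil

def usage_fraction(path):
    """Fraction of the filesystem at `path` currently in use."""
    total, used, _free = shutil.disk_usage(path)
    return used / total

def check(path="/", threshold=0.80):
    """Return True (and emit a warning) if usage is at or above the threshold."""
    frac = usage_fraction(path)
    if frac >= threshold:
        # Placeholder notification: in practice, send mail or hit a webhook here.
        print(f"WARNING: {path} is {frac:.0%} full")
        return True
    return False
```

Pointed at the upload partition and wired to the admins' mail, this would give exactly the "80% full" heads-up suggested above.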
ID: 56063
Still down: 12/19/2020 1:13:30 PM (CET) | GPUGRID | [error] Error reported by file upload server: Server is out of disk space
ID: 56064
I notice that most of my initial attempts at uploading reached 100%, but were not acknowledged. Now, if I retry one, it errors out immediately with the "out of disk space" error.
ID: 56065
The GPUGRID website was unavailable ("not found") during this upload problem yesterday. The problem seems to be more than just available space on a task server.
ID: 56066
I have not seen "not found" yet, but I got the same as the others: the first message is "can't open file", then afterwards "no disk space":
lör 19 dec 2020 14:16:28 | GPUGRID | [error] Error reported by file upload server: can't open file /home/ps3grid/projects/PS3GRID/upload/239/2f5tX01_450_4-TONI_MDADex2sf-44-50-RND6642_0_0: No space left on device
lör 19 dec 2020 14:16:28 | GPUGRID | [error] Error reported by file upload server: Server is out of disk space
lör 19 dec 2020 14:16:28 | GPUGRID | Backing off 00:12:59 on upload of 2f5tX01_450_4-TONI_MDADex2sf-44-50-RND6642_0_0
lör 19 dec 2020 14:16:28 | GPUGRID | Backing off 00:15:16 on upload of 2f5tX01_450_4-TONI_MDADex2sf-44-50-RND6642_0_1
lör 19 dec 2020 14:16:29 | GPUGRID | [error] Error reported by file upload server: Server is out of disk space
lör 19 dec 2020 14:16:29 | GPUGRID | [error] Error reported by file upload server: Server is out of disk space
lör 19 dec 2020 14:16:29 | GPUGRID | Backing off 00:07:54 on upload of 2f5tX01_450_4-TONI_MDADex2sf-44-50-RND6642_0_2
lör 19 dec 2020 14:16:29 | GPUGRID | Backing off 00:05:48 on upload of 2f5tX01_450_4-TONI_MDADex2sf-44-50-RND6642_0_8
ID: 56067
Disks are full. I'm suspending the project. Sorry for the missed credits.
ID: 56069
Disks are full. I'm suspending the project. Sorry for the missed credits. I'm more sorry for the delay to your research project, but the results can stay here until you're ready to receive them. You and Grosso can take a well-earned break - happy holidays.
ID: 56071
The infinite-capacity disks were out of stock. Maybe you'll get one for Christmas.
ID: 56072
Disks are full. I'm suspending the project. Sorry for the missed credits. Which means that the project will not be continued until next year?
ID: 56073
Disks are full. I'm suspending the project. Sorry for the missed credits. Hope things work out for the positive soon. We are here to help out when needed.
ID: 56075
Now that the donation page is finally working, you can help out with running expenses and hardware purchases by donating some funds to the project.
ID: 56079
This project appears to be producing staggering amounts of data as it evolves. Lately I've noticed a significant increase in users whose RAC is above 1M. Are we now "outrunning" Grosso?
ID: 56083
Hmmm... I just got a new GTX 1660 card that is cranking out about 15 times more data than my old GT 730 card. I guess I broke it. LOL
ID: 56090
Misery loves company. Einstein is having upload problems, too. Probably from GPUGRID folks looking for work.
ID: 56094
Misery loves company. Einstein is having upload problems, too. Probably from GPUGRID folks looking for work. Their problems are only with the gamma-ray tasks; crunch the gravitational-wave tasks and there are no upload problems, since they use different upload servers.
ID: 56095
I am not having problems with Folding either. I run that for the GPUs, and BOINC for the CPUs (except for a Ryzen 3950X with a GTX 1070, where I run both on Folding; the CPU alone accounts for around 600k PPD, and the GPU adds 1.4M more).
ID: 56096
Misery loves company. Einstein is having upload problems, too. Probably from GPUGRID folks looking for work. In the Einstein forum I read that upload problems there occur mostly on weekends, like last weekend. So we GPUGRID folks may have worsened the situation, but we did NOT create the problem :-)
ID: 56098
This morning my current tasks were uploaded successfully: 21.12.2020 9:35:52 | GPUGRID | Started upload of 3m0eA01_379_2-TONI_MDADex2sm-43-50-RND8091_0_2. So there are no files in the upload list now, but the tasks still have "in progress" status.
ID: 56117