2022-11-23 19:39:39,220 INFO Retrying now...
2022-11-23 19:39:39,261 WARNING Recovering from connection error [HTTPSConnectionPool(host='download-0018.nuist.love', port=443): Max retries exceeded with url: /cache-compute-0018/cache/data7/adaptor.mars.external-1669189491.3065188-20059-18-4b76a102-6efc-4cea-a69b-ea964a69cb5c.grib (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000207572DCCA0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))], attemps 79 of 500
2022-11-23 19:39:39,261 WARNING Retrying in 120 seconds
2022-11-23 19:41:39,273 INFO Retrying now...
2022-11-23 19:41:39,313 WARNING Recovering from connection error [HTTPSConnectionPool(host='download-0018.nuist.love', port=443): Max retries exceeded with url: /cache-compute-0018/cache/data7/adaptor.mars.external-1669189491.3065188-20059-18-4b76a102-6efc-4cea-a69b-ea964a69cb5c.grib (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000207572DD2D0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))], attemps 80 of 500
2022-11-23 19:41:39,313 WARNING Retrying in 120 seconds
2022-11-23 19:43:39,320 INFO Retrying now...
2022-11-23 19:43:39,391 WARNING Recovering from connection error [HTTPSConnectionPool(host='download-0018.nuist.love', port=443): Max retries exceeded with url: /cache-compute-0018/cache/data7/adaptor.mars.external-1669189491.3065188-20059-18-4b76a102-6efc-4cea-a69b-ea964a69cb5c.grib (Caused by NewConnectionError('<urllib3.connection.HTTPSConnection object at 0x00000207572DCCA0>: Failed to establish a new connection: [Errno 11001] getaddrinfo failed'))], attemps 81 of 500
2022-11-23 19:43:39,391 WARNING Retrying in 120 seconds




The log output above is the error I keep getting.

Even when I rerun the script, the problem persists. Is this a common issue?
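
For context, "[Errno 11001] getaddrinfo failed" is a DNS-resolution failure on Windows, i.e. the download host could not be resolved at all. A quick way to check this (just a diagnostic sketch, not part of my retrieval script; the host name is copied from the log above) is:

    import socket

    host = "download-0018.nuist.love"   # host taken from the log above
    try:
        infos = socket.getaddrinfo(host, 443)
        print(host, "resolves to", sorted({info[4][0] for info in infos}))
    except socket.gaierror as err:
        print("DNS lookup for", host, "failed:", err)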

7 Comments

  1. Hi, it's possible that when you ran the request script, a CDS system session was taking place (these are often done on a Wednesday). Was it working OK for you at the end of last week?

    Thanks,

    Kevin

    1. Hi, Kevin:

      The system is working now!

      I'll also try to retrieve data on mornings other than Wednesday from now on.

      Thanks!

      Davy

  2. Hi both. I'm also getting this problem, and have been for the past two weeks. It always fails at seemingly the same point, as highlighted below. Any thoughts welcome. I've tried changing the timeout options on the API call (a sketch of what I mean follows the log below), etc. I've checked the request status online and the file is listed as ready to download, but the connection always drops:


    2022-12-14 14:05:32,467 INFO Downloading https://download-0013-clone.copernicus-climate.eu/cache-compute-0013/cache/data3/adaptor.mars.external-1671022383.947712-9187-16-12284ae0-97af-415e-9f84-f35588f5dad8.grib to /Workspace/ERA5test/ANOG__ML.20211201.77.78.grb (1.9G)
    2022-12-14 14:10:28,288 INFO Download rate 6.6M/s
    marsclass: EA
    dataset: None
    type: AN
    levtype: SFC
    levelist: 1
    repres:
    date: 20211201/to/20211203
    resol: 799
    stream: OPER
    area: 60.0/-24.0/40.0/20.0
    time: 00/01/02/03/04/05/06/07/08/09/10/11/12/13/14/15/16/17/18/19/20/21/22/23
    step: 00
    expver: 1
    number: OFF
    accuracy: 24
    grid: 0.28125/0.28125
    gaussian:
    target: /Workspace/ERA5test/ANOG__SL.20211201.77.78.grb
    param: 141.128/151.128/164.128/165.128/166.128/167.128/168.128/129.128/172.128
    target: /Workspace/ERA5test/ANOG__SL.20211201.77.78.grb
    RETRIEVE ERA5 WITH CDS API!
    2022-12-14 14:10:28,535 INFO Welcome to the CDS
    2022-12-14 14:10:28,536 INFO Sending request to https://cds.climate.copernicus.eu/api/v2/resources/reanalysis-era5-single-levels
    2022-12-14 14:10:28,666 INFO Downloading https://download-0003-clone.copernicus-climate.eu/cache-compute-0003/cache/data2/adaptor.mars.internal-1670933062.7962708-12283-17-90880781-80ab-4830-8b89-18a5da2cd5d9.grib to /Workspace/ERA5test/ANOG__SL.20211201.77.78.grb (5.5M)
    2022-12-14 14:12:38,219 WARNING Recovering from connection error [HTTPSConnectionPool(host='download-0003-clone.copernicus-climate.eu', port=443): Max retries exceeded with url: /cache-compute-0003/cache/data2/adaptor.mars.internal-1670933062.7962708-12283-17-90880781-80ab-4830-8b89-18a5da2cd5d9.grib (Caused by NewConnectionError('<urllib3.connection.VerifiedHTTPSConnection object at 0x7ff4f565f6d8>: Failed to establish a new connection: [Errno 110] Connection timed out',))], attemps 0 of 500
    2022-12-14 14:12:38,219 WARNING Retrying in 120 seconds
    2022-12-14 14:14:38,331 INFO Retrying now...
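
    To be concrete about the timeout options I mentioned: the sketch below is not my real request (the dataset and request body are placeholders), but timeout, retry_max and sleep_max are the cdsapi Client settings behind the "attemps N of 500" and "Retrying in 120 seconds" lines in the log:

    import cdsapi

    c = cdsapi.Client(
        timeout=600,    # per-request timeout in seconds (cdsapi default is 60)
        retry_max=500,  # maximum number of retries, as seen in the log ("of 500")
        sleep_max=120,  # maximum wait between retries ("Retrying in 120 seconds")
    )
    c.retrieve(
        'reanalysis-era5-single-levels',   # dataset name taken from the log above
        {
            'product_type': 'reanalysis',
            'variable': '2m_temperature',  # placeholder variable, not my real list
            'year': '2021',
            'month': '12',
            'day': '01',
            'time': '00:00',
            'format': 'grib',
        },
        'test.grib',
    )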



    1. import cdsapi
      from win32com.client import Dispatch
      from subprocess import call
      import calendar
      # XDW.SWJTU

      def IdmDownloader(request_url, storage_path, file_name):
          # IDM installation directory
          idm_engine = "C:\\Program Files (x86)\\Internet Download Manager\\IDMan.exe"
          # Add the download task to the IDM queue
          call([idm_engine, '/d', request_url, '/p', storage_path, '/f', file_name, '/a'])
          # Start the task queue
          call([idm_engine, '/s'])

      if __name__ == '__main__':
          c = cdsapi.Client()
          KeyWordsDic = {
              'date': '20190201',
              'levelist': '1/to/137',
              'levtype': 'ml',
              'param': 133,  # q (specific humidity)
              'stream': 'oper',
              'time': '00/to/23/by/6',
              'type': 'an',
              'area': '90/-180/-90/180',
              'format': 'netcdf',
              'grid': '1.0/1.0',
          }

          # Download ERA5 model-level data one day at a time for the chosen year(s)/month(s)
          for y in range(2009, 2010):
              for m in range(6, 13):
                  d_ym = calendar.monthrange(y, m)[1]  # number of days in this month
                  start = 1
                  for d in range(start, d_ym + 1):
                      # Date string in YYYYMMDD form
                      date_ = str(y) + str(m).zfill(2) + str(d).zfill(2)
                      KeyWordsDic['date'] = date_
                      print("%d %d %d" % (y, m, d))
                      File_Name = 'ERA5_' + 'q' + '_' + date_ + '.nc'
                      # Submit the MARS request and get the direct download URL
                      r = c.retrieve('reanalysis-era5-complete', KeyWordsDic)
                      Request_url = r.location
                      path = "E:\\XDW\\Atm_RawData\\ERA5_DATA\\Daily\\1Degree\\q"
                      # IdmDownloader(Request_url, path, File_Name)
                      # Hand the URL to the Thunder (Xunlei) download manager instead
                      thunder = Dispatch('ThunderAgent.Agent64.1')
                      thunder.AddTask(Request_url, File_Name, path, '', '', 1, 1, 5)
                      thunder.CommitTasks()
      1. This is the script I use when retrieving ERA5 model-level data; I hope it can serve as a reference for you, my friend.

        Maybe you can separate your variables into different scripts.

        And if you're in China, a VPN host may help you.
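
        One more note, in case you don't use IDM or Thunder: this is only a sketch (reusing the names from the script above, with the target file name as an example), but the cdsapi result object already exposes both the direct URL and a download method, so the external downloader can be skipped entirely:

        r = c.retrieve('reanalysis-era5-complete', KeyWordsDic)
        print(r.location)       # direct download URL, usable in any download manager
        r.download(File_Name)   # or let cdsapi download it with its own retry logic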


  3. Hi David, are you still seeing this issue? If so, can you share the CDS API script you are running, please?

    Thanks,

    Kevin

    1. Hi Kevin.

      I've uploaded my script.

      Hope it will help.

      Davy.