
This process is way too slow #199

Closed
u382514 opened this issue May 7, 2021 · 40 comments
Labels
bug Something isn't working

Comments

@u382514 commented May 7, 2021

5 minutes to fix a bug, 40 minutes to upload/download the artifact. Are there any plans in the works to make this more performant?

u382514 added the "bug" label on May 7, 2021
@konradpabjan (Collaborator)

How many files are you uploading? General telemetry indicates that each file takes 100ms to upload, with 2 concurrent uploads happening at a time (you can see all this information by turning on step debugging). Note that uploads from self-hosted runners will be slower.

If you're uploading hundreds or thousands of files, then you can create a zip yourself beforehand using the steps here and upload that: https://github.com/actions/upload-artifact#too-many-uploads-resulting-in-429-responses. That will significantly reduce the number of HTTP calls and your upload will be much faster. Unfortunately, you will get a zip of a zip when you download the artifact until we add the option to download files individually from the UI.
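For illustration, a minimal sketch of that pre-zip approach; the source directory, archive name, and artifact name are placeholders, and the action version is the one current at the time:

steps:
  # Bundle everything into a single file so the action performs one upload
  # instead of one HTTP call per file.
  - name: Zip build output
    run: zip -q -r build-output.zip dist/
  - name: Upload zipped artifact
    uses: actions/upload-artifact@v2
    with:
      name: build-output
      path: build-output.zip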

@u382514 (Author) commented May 14, 2021

I tried that too. I zipped it up in the build and tried to upload the artifact, and that shaved about 10 minutes off a 150 MB file. So it still took roughly 30 minutes for just the upload/download piece of it. The build, test, zip and subsequent unzip and push to prod only took about 5 minutes of a 35-minute process. For now I just skip the artifact upload/download process and the time to prod is about 5 minutes total. It seems to be consistently slow, too, as I ran the process numerous times without any real differences to note.

@BlairMcClelland

Any update on this? It is painfully slow compared to Azure Pipelines' Pipeline Artifacts offering. Uploading ~4500 files / ~350 MB took <30 seconds there, while in GitHub Actions it's taking around 5 minutes.
https://docs.microsoft.com/en-us/azure/devops/pipelines/artifacts/pipeline-artifacts?view=azure-devops&tabs=yaml

@twz123 commented Jul 20, 2022

I noticed that actions/cache is nearly an order of magnitude faster than actions/upload-artifact. In my experiments, I found that a single 400 MiB file takes about 150 seconds to upload and 15 seconds to download (with actions/download-artifact). In contrast, putting it into a cache takes 23 seconds, and restoring the cache takes less than 10 seconds.

So the reason for the slowness does not seem to be insufficient network bandwidth on the actions runner side.

@tamascsaba

I have the same issue: with a large number of files, both upload and download are really slow.

@wind57 commented Aug 1, 2022

Uploading is very different than caching: you might need different artifacts for each build, and caching does not achieve that. Suppose you build Docker images, which are unique in each build and need to be shared across other steps; they cannot really be cached unless you hack it.

What we have done is something like this:

name: sets environment variables
description: sets environment variables
runs:
  using: "composite"
  steps:
    - name: set env variables
      shell: bash
      run: |
        echo "DOCKER_IMAGES_KEY=$(echo $GITHUB_RUN_ID)" >> $GITHUB_ENV

and then later:

name: download docker images
description: download docker images
runs:
  using: "composite"
  steps:
    - uses: actions/cache@v2
      with:
        path: /tmp/docker/images
        key: docker-images-cache-${{ env.DOCKER_IMAGES_KEY }}
        restore-keys: docker-images-cache-${{ env.DOCKER_IMAGES_KEY }}

Notice the GITHUB_RUN_ID usage. Since this one is unique per run, it achieves "upload/download" via caching, and the time went down in our case from 15 minutes to 1 minute.
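For completeness, a hedged sketch of the matching "save" side; the image name and tar filename are placeholders, and actions/cache@v2 writes the cached path in its post step at the end of the job:

name: save docker images
description: save docker images
runs:
  using: "composite"
  steps:
    - name: export docker images to disk
      shell: bash
      run: |
        mkdir -p /tmp/docker/images
        # "my-app:latest" stands in for whatever image the build produced
        docker save my-app:latest -o /tmp/docker/images/my-app.tar
    - uses: actions/cache@v2
      with:
        path: /tmp/docker/images
        key: docker-images-cache-${{ env.DOCKER_IMAGES_KEY }}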

@GMNGeoffrey commented Nov 17, 2022

I just attempted to upload a large artifact (7.5 GB) and discovered just how unusably slow this is. I gave up when it became clear it was going to take around an hour to upload. Yeah, that's not going to work. One thing I notice is that it spends the first 5 minutes "starting upload", whatever that means.

Logs
Thu, 17 Nov 2022 22:58:42 GMT With the provided path, there will be 1 file uploaded
Thu, 17 Nov 2022 22:58:42 GMT Starting artifact upload
Thu, 17 Nov 2022 22:58:42 GMT For more detailed logs during the artifact upload process, enable step-debugging: https://docs.github.com/actions/monitoring-and-troubleshooting-workflows/enabling-debug-logging#enabling-step-debug-logging
Thu, 17 Nov 2022 22:58:42 GMT Artifact name is valid!
Thu, 17 Nov 2022 22:58:42 GMT Container for artifact "full-build-dir.tar.zst" successfully created. Starting upload of file(s)
Thu, 17 Nov 2022 22:58:52 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:02 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:12 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:22 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:32 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:42 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 22:59:52 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:02 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:12 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:22 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:32 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:42 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:00:52 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:02 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:12 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:22 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:32 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:42 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:01:52 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:02 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:12 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:22 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:32 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:42 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:02:52 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:03:02 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:03:12 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:03:15 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.1%) bytes 0:8388607
Thu, 17 Nov 2022 23:03:17 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.2%) bytes 8388608:16777215
Thu, 17 Nov 2022 23:03:19 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.3%) bytes 16777216:25165823
Thu, 17 Nov 2022 23:03:22 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.4%) bytes 25165824:33554431

Running with debug mode gives us some more insights:

Logs
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating condition for step: 'Upload artifacts'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating: success()
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating success:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]=> true
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Result: true
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Starting: Upload artifacts
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Loading inputs
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating: env.BUILD_DIR_ARCHIVE
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating Index:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..Evaluating env:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..=> Object
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..Evaluating String:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..=> 'BUILD_DIR_ARCHIVE'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]=> 'full-build-dir.tar.zst'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Result: 'full-build-dir.tar.zst'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating: env.BUILD_DIR_ARCHIVE
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Evaluating Index:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..Evaluating env:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..=> Object
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..Evaluating String:
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]..=> 'BUILD_DIR_ARCHIVE'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]=> 'full-build-dir.tar.zst'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Result: 'full-build-dir.tar.zst'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Loading env
Thu, 17 Nov 2022 23:24:01 GMT Run actions/upload-artifact@3cea5372237819ed00197afe530f5a7ea3e805c8
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]followSymbolicLinks 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]implicitDescendants 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]omitBrokenSymbolicLinks 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]followSymbolicLinks 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]implicitDescendants 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]omitBrokenSymbolicLinks 'true'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Search path '/home/runner/work/iree/iree/full-build-dir.tar.zst'
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]File:/home/runner/work/iree/iree/full-build-dir.tar.zst was found using the provided searchPath
Thu, 17 Nov 2022 23:24:01 GMT With the provided path, there will be 1 file uploaded
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Root artifact directory is /home/runner/work/iree/iree
Thu, 17 Nov 2022 23:24:01 GMT Starting artifact upload
Thu, 17 Nov 2022 23:24:01 GMT For more detailed logs during the artifact upload process, enable step-debugging: https://docs.github.com/actions/monitoring-and-troubleshooting-workflows/enabling-debug-logging#enabling-step-debug-logging
Thu, 17 Nov 2022 23:24:01 GMT Artifact name is valid!
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Artifact Url: https://pipelines.actions.githubusercontent.com/SaXEyjchYvnWH9Rq3qeZEPLXtO1Hv1A8b0nd0VWR5igNiKNAhf/_apis/pipelines/workflows/3492591056/artifacts?api-version=6.0-preview
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]Upload Resource URL: https://pipelines.actions.githubusercontent.com/SaXEyjchYvnWH9Rq3qeZEPLXtO1Hv1A8b0nd0VWR5igNiKNAhf/_apis/resources/Containers/21359059
Thu, 17 Nov 2022 23:24:01 GMT Container for artifact "full-build-dir.tar.zst" successfully created. Starting upload of file(s)
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]File Concurrency: 2, and Chunk Size: 8388608
Thu, 17 Nov 2022 23:24:01 GMT ##[debug]/home/runner/work/iree/iree/full-build-dir.tar.zst is greater than 64k in size. Creating a gzip file on-disk /tmp/tmp-66835-BA3oAMVklNVD to potentially reduce the upload size
Thu, 17 Nov 2022 23:24:10 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:24:20 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:24:30 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:24:40 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:24:50 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:00 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:10 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:20 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:30 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:40 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:25:50 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:01 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:10 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:20 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:30 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:40 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:26:51 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:00 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:10 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:21 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:31 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:40 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:27:50 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:28:00 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:28:10 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:28:20 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:28:27 GMT ##[debug]The gzip file created for /home/runner/work/iree/iree/full-build-dir.tar.zst is smaller than the original file. The file will be uploaded using gzip.
Thu, 17 Nov 2022 23:28:29 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.1%) bytes 0:8388607
Thu, 17 Nov 2022 23:28:30 GMT Total file count: 1 ---- Processed file #0 (0.0%)
Thu, 17 Nov 2022 23:28:31 GMT Uploaded /home/runner/work/iree/iree/full-build-dir.tar.zst (0.2%) bytes 8388608:16777215

/home/runner/work/iree/iree/full-build-dir.tar.zst is greater than 64k in size. Creating a gzip file on-disk /tmp/tmp-66835-BA3oAMVklNVD to potentially reduce the upload size

As you can probably tell by the filename, that is not a great idea. I recall reading that the artifact upload did its own gzip compression (although now I can't find where it's mentioned), but when I tried relying on that, the whole thing was even slower. Using parallel zstd is way faster.

But that alone cannot explain the time here. As noted above, caching is way faster, so this can't only be from limited network bandwidth on the VM. These artifacts are going straight to Azure, probably even in the same data center! There's really no excuse for this to be slow. By contrast, when running on my self-hosted runners in GCP, the gcloud upload of approximately this same file takes under a minute (it's a 7.5G build archive, so unfortunately it's going to be somewhat slow no matter what).

I've traced through the code and whatever it's doing in this while loop is the source of the slowness: https://github.com/actions/toolkit/blob/819157bf872a49cfcc085190da73894e7091c83c/packages/artifact/src/internal/upload-http-client.ts#L321. It's uploading each chunk serially, which probably doesn't help, but gcloud does the same thing with parallel composite upload turned off and does it in the aforementioned minute, so I don't think that can be it. It's also recomputing the HTTP headers for each chunk and creating a new file stream. Maybe the underlying client has connection pooling, so the former isn't an issue? I don't see it in here: https://github.com/actions/toolkit/blob/main/packages/http-client/src/index.ts#L328, but I'm no TypeScript or HTTP expert. It's hard to tell without profiling exactly what the cause is, but profiling does seem to be in order.
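As a side note, a sketch of the parallel-zstd packing step mentioned above, assuming GNU tar and zstd are available on the runner and build/ stands in for the real build directory:

- name: Pack build directory with multi-threaded zstd
  run: |
    # -T0 lets zstd use every available core, which is much faster than the
    # single-threaded gzip pass the upload action applies on its own.
    tar -I 'zstd -T0' -cf full-build-dir.tar.zst build/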

@ascopes commented Jan 19, 2023

Is there any update on this? I have a case where I run tests on several different configurations. The upload artifacts step takes several minutes as a result.

If I upload them all in a single tarball, this is not an issue and the process takes a matter of seconds, but this results in a tarball within a zip folder once the pipeline completes, which is far from ideal.

I wonder whether it would make sense for GitHub to pack these into a single tarball internally for upload and then unpack them server-side afterwards? Either that or run more uploads in parallel to avoid this delay?


Ideally, uploading one 40 MiB file shouldn't take dramatically less time than uploading the same data as 4000 10 KiB files, right? They end up in the same zip file once the job has completed.

@StanlyLife

I have the same problem: with a project of 23 MB it takes ages (almost 1 hour).

As mentioned here: https://azureossd.github.io/2022/02/07/React-Deployment-on-App-Service-Linux/

From Official Documentation: During upload, each file is uploaded concurrently in 4MB chunks using a separate HTTPS connection per file. Chunked uploads are used so that in the event of a failure, the upload can be retried. If there is an error, a retry will be attempted after a certain period of time.

Uploading will generally be faster if there are fewer, larger files rather than lots of smaller files. Depending on the types and quantities of files being uploaded, it might be beneficial to separately compress and archive everything into a single archive (using something like tar or zip) before starting an artifact upload to speed things up.

A solution is mentioned in the article

@twz123 commented Feb 7, 2023

Stuffing many files into archive files will help, but this action is still way too slow even for single-file uploads.

@StanlyLife

Stuffing many files into archive files will help, but this action is still way too slow even for single-file uploads.

Reduced my deploy time from 2.5 hours to 15 minutes. Still not good, but a lot better.

@m0un10 commented Mar 15, 2023

This is insanely broken, how can such a small set of files take so long...

Lots of dropouts like this:

Total file count: 36994 ---- Processed file #20114 (54.3%)
An error has been caught http-client index 1, retrying the upload
Error: read ECONNRESET
    at TLSWrap.onStreamRead (node:internal/stream_base_commons:217:20) {
  errno: -104,
  code: 'ECONNRESET',
  syscall: 'read'
}
Exponential backoff for retry #1. Waiting for 5291 milliseconds before continuing the upload at offset 0
Finished backoff for retry #1, continuing with upload
Total file count: 36994 ----
@mattgodbolt

Also seeing really slow upload speed on GHE: we have a ~135MB tar.xz file that takes ~2m30 to upload. Downloading in a later stage takes 11d. Is there some on-the-fly-ZIPfiling we can disable? I don't need my file to be wrapped in a ZIP, and I suspect the attempts to further compress my tarball are what's taking the time.

@mattgodbolt

Can anyone recommend an action that stores things on S3-compatible storage instead? This is beyond a joke: I spend nearly 40% of my build time (~2m30) waiting for an upload.
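Not an endorsement of any particular action, but a hedged sketch of the plain-CLI route, assuming the AWS CLI is present on the runner and the bucket, endpoint, and secret names are placeholders (--endpoint-url is what points it at an S3-compatible provider):

- name: Upload archive to S3-compatible storage
  env:
    AWS_ACCESS_KEY_ID: ${{ secrets.S3_ACCESS_KEY_ID }}
    AWS_SECRET_ACCESS_KEY: ${{ secrets.S3_SECRET_ACCESS_KEY }}
  run: |
    # bucket, key prefix, and endpoint are placeholders
    aws s3 cp build.tar.xz "s3://example-artifact-bucket/${GITHUB_RUN_ID}/build.tar.xz" \
      --endpoint-url "https://s3.example.com"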

@twz123 commented Oct 5, 2023

I have a gut feeling that this issue might have been addressed in actions/toolkit#1488, so upload speeds might improve whenever actions/upload-artifact@v4 gets released?

@fregante commented Oct 8, 2023

From that issue:

This PR adds zip creation + upload functionality to the upcoming v2.0.0 version of @actions/artifact. This in turn will power new v4 versions of upload-artifact and download-artifact.

So yes! This will be fixed in upload-artifact@v4 💚

@tech234a commented Oct 19, 2023

I see there is an actions/upload-artifact@v4-beta branch that incorporates these changes, but it doesn't seem to be working at the moment. It outputs: Error: Unable to get the ACTIONS_RESULTS_URL env variable. Has anyone been able to get this to work?

My understanding is that this environment variable is supposed to be provided by the runner automatically.

@Ansuel commented Oct 19, 2023

@tech234a we can only wait... they have already implemented the fix; this action just needs to be updated to use the new approach.

@prplecake

[Image: xkcd-compiling-uploading-artifacts]

@Ansuel commented Dec 15, 2023

With v4... uploading and downloading 5 GB of artifacts went from 20 minutes down to 1:30-2 minutes... Feels good! Thanks a lot for v4.

@Amerlander commented Dec 16, 2023

Upload and download times dropped from 5 minutes to 4-12 seconds (~2300 files, 100 MB).

@Ansuel commented Dec 16, 2023

I'm noticing timeouts and internal errors once in a while, but I will keep monitoring it... Maybe it would be an idea to introduce some kind of retry?

@donv commented Dec 17, 2023

From one minute to 3-7 seconds. Thank you!

@unkcpz commented Dec 18, 2023

v4 improved the performance quite a lot; for our workflow, from 10 min to 20 s!
However, we noticed some timeouts when running the action on a self-hosted runner, though sometimes the workflow passed. There is a breaking change related to self-hosted runners in https://github.com/actions/toolkit/tree/main/packages/artifact#breaking-changes; what confuses me is that the workflow does not always fail.

@konradpabjan (Collaborator) commented Dec 19, 2023

As some have already noticed, v4 is officially out! 😃 https://github.blog/changelog/2023-12-14-github-actions-artifacts-v4-is-now-generally-available/

Recommend switching over, as one of the big improvements we focused on was speed.
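A minimal sketch of the switch (the artifact name and path here are placeholders):

# producing job
- uses: actions/upload-artifact@v4
  with:
    name: my-artifact
    path: dist/

# consuming job
- uses: actions/download-artifact@v4
  with:
    name: my-artifact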

From our testing we've seen upload speed increase by up to 98% in worst case scenarios. We'll have a blog post sometime in January with more details + some data.

Here is a random example from one of our runs comparing v3 to v4


https://gh.io/artifact-v3-vs-v4

If there are any issues like timeouts or upload failures, then please open up new issues (there are already a few related to v4) so we can investigate them separately.

Closing out!

@ElvenSpellmaker

@konradpabjan Any idea when this will come to GHES please? This current client is uploading large artefacts using GHES and v3 is really slow.

@kronenthaler

@konradpabjan Any idea when this will come to GHES please? This current client is uploading large artefacts using GHES and v3 is really slow.

It seems it is on the roadmap for Oct-Dec this year: github/roadmap#930
