
Avoid multiple parallel network requests for the same URL #1461

Open
Kazachenko1998 opened this issue Sep 16, 2022 · 3 comments
Labels
enhancement New feature or request

Comments

@Kazachenko1998

I am developing an application where many identical images (SVGs from the server) can appear on one screen. When I open the application for the first time, I see multiple parallel network requests for the same image.

I think it's worth adding a download pool: if a URL is already being fetched, don't issue the request again; instead, wait for the in-flight request to complete and reuse its result.

override fun onCreate(savedInstanceState: Bundle?) {
    // ...
    Coil.setImageLoader(
        ImageLoader.Builder(this)
            .components { add(SvgDecoder.Factory()) }
            .crossfade(true)
            .build()
    )
    // ...
}

AsyncImage(
    placeholder = rememberAsyncImagePainter(R.drawable.ic_plchldr),
    error = rememberAsyncImagePainter(R.drawable.ic_plchldr),
    model = URL,
    contentDescription = URL,
    contentScale = ContentScale.Crop,
    modifier = Modifier.fillMaxSize()
)
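The pooling idea above could be sketched roughly as follows. This is a minimal illustration using `java.util.concurrent` rather than Coil's internals; `DownloadPool` and its `fetch` parameter are hypothetical names, not Coil API:

```kotlin
import java.util.concurrent.CompletableFuture
import java.util.concurrent.ConcurrentHashMap
import java.util.concurrent.atomic.AtomicInteger

// Hypothetical sketch: deduplicate in-flight downloads by URL.
class DownloadPool(private val fetch: (String) -> ByteArray) {
    private val inFlight = ConcurrentHashMap<String, CompletableFuture<ByteArray>>()

    // Returns the existing in-flight future for this URL, or starts a new one.
    fun download(url: String): CompletableFuture<ByteArray> =
        inFlight.computeIfAbsent(url) { key ->
            CompletableFuture.supplyAsync { fetch(key) }
                // Once finished (successfully or not), drop the entry so a
                // later request for the same URL starts fresh.
                .whenComplete { _, _ -> inFlight.remove(key) }
        }
}

fun main() {
    val calls = AtomicInteger(0)
    val pool = DownloadPool { url ->
        calls.incrementAndGet()
        Thread.sleep(100) // simulate network latency
        url.toByteArray()
    }
    // Three composables request the same URL at (roughly) the same time...
    val futures = List(3) { pool.download("https://example.com/icon.svg") }
    futures.forEach { it.join() }
    // ...but only one network call is made.
    println(calls.get()) // prints 1
}
```

A real implementation would also need to evict entries on cancellation and decide how failures propagate to the merged waiters.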


@Kazachenko1998 Kazachenko1998 added the enhancement New feature or request label Sep 16, 2022
@colinrtwhite
Member

colinrtwhite commented Sep 18, 2022

I agree this would be nice to have. Ideally, HttpUriFetcher should reference count the number of responses waiting for its network request. We probably can't merge requests that only share the same URL - we'll need request headers to match as well (at least by default).

Additionally, we can only do this for requests that are going to be written to the disk cache. If a response isn't written to the disk cache, its body can only be read once, so it can't serve multiple waiters. There are a few other edge cases that mean this would have to be opt-in, at least for a while.
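The two constraints above (merge only when URL *and* headers match, and reference-count the waiters on one network call) might look like this sketch. `RequestKey`, `RefCountedCall`, and `MergingFetcher` are hypothetical names for illustration, not Coil or `HttpUriFetcher` API:

```kotlin
import java.util.concurrent.CompletableFuture
import java.util.concurrent.ConcurrentHashMap
import java.util.concurrent.CountDownLatch
import java.util.concurrent.atomic.AtomicInteger

// Requests merge only when the URL *and* the request headers match.
data class RequestKey(val url: String, val headers: Map<String, String>)

class RefCountedCall(val result: CompletableFuture<ByteArray>) {
    private val waiters = AtomicInteger(0)
    fun acquire() { waiters.incrementAndGet() }
    // Returns true when the last waiter has left and the call may be cancelled.
    fun release(): Boolean = waiters.decrementAndGet() == 0
}

class MergingFetcher(private val fetch: (RequestKey) -> ByteArray) {
    private val inFlight = ConcurrentHashMap<RequestKey, RefCountedCall>()

    fun enqueue(key: RequestKey): RefCountedCall =
        inFlight.computeIfAbsent(key) { k ->
            RefCountedCall(
                CompletableFuture.supplyAsync { fetch(k) }
                    .whenComplete { _, _ -> inFlight.remove(k) }
            )
        }.also { it.acquire() }

    fun cancel(key: RequestKey, call: RefCountedCall) {
        if (call.release()) {
            // No one is waiting any more: abort the shared network call.
            call.result.cancel(true)
            inFlight.remove(key)
        }
    }
}

fun main() {
    val gate = CountDownLatch(1)
    val fetches = AtomicInteger(0)
    val fetcher = MergingFetcher { key ->
        gate.await()
        fetches.incrementAndGet()
        key.url.toByteArray()
    }
    val headers = mapOf("Accept" to "image/svg+xml")
    val a = fetcher.enqueue(RequestKey("https://example.com/x.svg", headers))
    val b = fetcher.enqueue(RequestKey("https://example.com/x.svg", headers)) // merged with `a`
    val c = fetcher.enqueue(RequestKey("https://example.com/x.svg", emptyMap())) // different headers: not merged
    gate.countDown()
    a.result.join(); c.result.join()
    println(a === b)       // prints true
    println(fetches.get()) // prints 2
}
```

A production version would also have to handle the race between a request completing (and being removed from the map) and a late `enqueue` for the same key.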

@colinrtwhite colinrtwhite changed the title from "Do not load the same URLs at the same time" to "Avoid multiple parallel network requests for the same URL" Sep 18, 2022
@colinrtwhite
Member

colinrtwhite commented Sep 28, 2022

Thinking about this more, we should also provide a configurable "keep alive" duration for network requests. That way the user can configure how long to wait before cancelling a network request once there are no more consumers for its result.

This would be useful for RecyclerViews and other cases where a request can be restarted multiple times in quick succession.
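The "keep alive" behaviour could be sketched like this: when the last consumer detaches, wait `keepAliveMs` before actually cancelling, so a quickly rebound view can reattach to the still-running request. `KeepAliveCall` and its members are hypothetical names for illustration:

```kotlin
import java.util.concurrent.Executors
import java.util.concurrent.ScheduledExecutorService
import java.util.concurrent.ScheduledFuture
import java.util.concurrent.TimeUnit

class KeepAliveCall(
    private val keepAliveMs: Long,
    private val onCancel: () -> Unit,
    private val scheduler: ScheduledExecutorService =
        Executors.newSingleThreadScheduledExecutor { r -> Thread(r).apply { isDaemon = true } },
) {
    private var consumers = 0
    private var pendingCancel: ScheduledFuture<*>? = null
    var cancelled = false
        private set

    @Synchronized
    fun attach() {
        consumers++
        pendingCancel?.cancel(false) // a consumer came back within the window
        pendingCancel = null
    }

    @Synchronized
    fun detach() {
        consumers--
        if (consumers == 0) {
            // Don't cancel immediately: give a rebinding view a grace period.
            pendingCancel = scheduler.schedule({
                synchronized(this) {
                    if (consumers == 0) {
                        cancelled = true
                        onCancel()
                    }
                }
            }, keepAliveMs, TimeUnit.MILLISECONDS)
        }
    }
}

fun main() {
    val call = KeepAliveCall(keepAliveMs = 200, onCancel = { println("network call cancelled") })
    call.attach()
    call.detach()           // view recycled...
    Thread.sleep(50)
    call.attach()           // ...and rebound within the keep-alive window
    Thread.sleep(400)
    println(call.cancelled) // prints false: the request survived the rebind
}
```

With `keepAliveMs = 0` this degenerates to today's behaviour (cancel as soon as the last consumer goes away), which is why making it configurable covers both cases.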

@gargVader

What is the status on this?
