Any data-fetching screen typically transitions through states such as idle, loading, loaded, and failure. In this post, I’ll demonstrate the evolution of a pattern, starting with a Combine-based implementation and refactoring it into a simpler solution built on async/await and an AsyncSequence backed by Swift’s native Observation framework.

States

I’ll represent states as an enum, along with a LoadingProgress object for more granular updates.

public enum LoadingState<Value: Hashable & Sendable>: Hashable, Sendable {
    case idle
    case loading(LoadingProgress?)
    case failure(HashableError)
    case loaded(Value)

    public static func failure(_ error: any Error & Sendable) -> LoadingState {
        .failure(HashableError(error))
    }
}

public struct LoadingProgress: Hashable, Sendable {
    public let isCanceled: Bool?
    public let message: String?
    public let percent: Int? // 0 to 100
}

HashableError wraps any Error & Sendable to preserve Hashable/Equatable semantics while still exposing the underlying error via its error property.
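
For reference, a minimal sketch of such a wrapper could look like this (the library’s actual implementation may differ, for instance in how equality is defined):

public struct HashableError: Hashable, Sendable {
    /// The wrapped error, still available to callers.
    public let error: any Error & Sendable

    public init(_ error: any Error & Sendable) {
        self.error = error
    }

    // Equality and hashing fall back to the error's description, which keeps
    // the wrapper Hashable even though `any Error` itself is not.
    public static func == (lhs: Self, rhs: Self) -> Bool {
        String(describing: lhs.error) == String(describing: rhs.error)
    }

    public func hash(into hasher: inout Hasher) {
        hasher.combine(String(describing: error))
    }
}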

Combine-Based

To standardize the handling of loadable content, we can define a protocol. A Combine-based version of this protocol might look like this:

/// An object that loads content (Legacy Combine version).
public protocol Loadable_Combine {
    associatedtype Value: Hashable, Sendable
    
    /// Emits states about the loading process.
    var state: PassthroughSubject<LoadingState<Value>, Never> { get }
        
    /// Initiates the load process.
    func load()
}
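
For context, consuming such a loader usually meant wiring up a view model along these lines (LoadableViewModel is purely illustrative, not part of the library):

import Combine
import Foundation

final class LoadableViewModel<L: Loadable_Combine>: ObservableObject {
    /// Mirrors the loader's latest state for SwiftUI.
    @Published private(set) var state: LoadingState<L.Value> = .idle

    private let loader: L
    private var cancellables = Set<AnyCancellable>()

    init(loader: L) {
        self.loader = loader

        // The boilerplate: subscribe, hop to the main queue, keep the cancellable alive.
        loader.state
            .receive(on: DispatchQueue.main)
            .sink { [weak self] newState in
                self?.state = newState
            }
            .store(in: &cancellables)

        loader.load()
    }
}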

This approach worked, but it relied on the PassthroughSubject, sink, and AnyCancellable boilerplate common to Combine. While powerful, modern Swift concurrency offers a cleaner path.

AsyncSequence

Apple’s focus has clearly shifted to async/await and AsyncSequence. They let us consume streams of values with a simple for await loop, integrate naturally with structured concurrency, and remove the need for manual subscription management. If needed, the swift-async-algorithms package provides operators similar to Combine’s.
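
For comparison with the Combine setup above, observing a stream of states becomes a plain loop; loader and render(_:) below are just stand-ins:

// No AnyCancellable to store: the loop ends when the surrounding task is cancelled.
for await state in loader.state {
    render(state)   // stand-in for whatever updates the UI
}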

Our modern Loadable protocol looks like this:

@MainActor
public protocol Loadable {
    associatedtype Value: Hashable, Sendable

    /// An asynchronous sequence that publishes the loading state.
    var state: any AsyncSequence<LoadingState<Value>, Never> { get }

    /// The latest state for quick sync when views reappear.
    var currentState: LoadingState<Value> { get }

    /// Flag indicating if the loading operation has been cancelled.
    var isCanceled: Bool { get }

    /// Cancels the ongoing loading operation.
    func cancel()

    /// Resets the loadable to its initial state.
    func reset()

    /// Initiates the loading of the value.
    func load() async
}

Base implementation

Implementing this protocol is more verbose than using Combine, but that can be solved with a base implementation. Simply subclass and override the fetch() method. BaseLoadable drives the Observation-backed AsyncSequence and yields .loading, .loaded, and .failure states for you.

@MainActor
class UserLoader: BaseLoadable<User> {
    override func fetch() async throws -> User {
        // Your async loading logic here
        try await Task.sleep(nanoseconds: 1_000_000_000)
        
        // Just return the value or throw an error
        return User(name: "Jane Doe")
    }
}
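
In a real app, fetch() would typically hit the network. Anything it throws surfaces as a .failure state, as in this sketch (the endpoint and User’s Decodable conformance are assumptions):

import Foundation

@MainActor
class ProfileLoader: BaseLoadable<User> {
    override func fetch() async throws -> User {
        // Hypothetical endpoint; assumes User conforms to Decodable.
        let url = URL(string: "https://example.com/api/profile")!
        let (data, _) = try await URLSession.shared.data(from: url)
        return try JSONDecoder().decode(User.self, from: data)
    }
}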

The LoadingView

With this new model, the LoadingView also becomes simpler. It no longer needs a ViewModel; it can directly observe the Loadable object’s state using the .task modifier.

struct LoadingView<L: Loadable, Content: View>: View {
    private var loader: L
    private var content: (L.Value) -> Content
    @State private var loadingState: LoadingState<L.Value> = .idle

    init(loader: L, @ViewBuilder content: @escaping (L.Value) -> Content) {
        self.loader = loader
        self.content = content
    }

    var body: some View {
        Group {
            switch loadingState {
            case .idle:
                ProgressView("Loading...")
            case .loading(let progress):
                ProgressView(progress?.message ?? "Loading…")
            case .loaded(let value):
                content(value)
            case .failure(let error):
                Text(error.error.localizedDescription)
            }
        }
        .onAppear {
            // Re-sync with the loader's latest state when the view reappears.
            if loadingState != loader.currentState {
                loadingState = loader.currentState
            }
        }
        .task {
            // Observe every state change for the lifetime of the view.
            for await state in loader.state {
                self.loadingState = state
            }
        }
    }
}
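
Wiring it up in a screen can then be as small as this (UserScreen is illustrative; the screen kicks off the load while LoadingView observes and renders the states):

struct UserScreen: View {
    let loader: UserLoader

    var body: some View {
        LoadingView(loader: loader) { user in
            Text("Hello, \(user.name)!")
        }
        .task {
            await loader.load()   // start the work; LoadingView renders the states
        }
    }
}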

State Persistence with Observation

A key problem with AsyncStream is that it’s a “one-time” event pipe. If you navigate away from a view, its .task is cancelled. When you navigate back, a new observer is created, but the AsyncStream doesn’t “replay” its last value. This causes the UI to revert to its .idle state, even if data was already loaded.

The latest version of the library uses Swift’s Observation framework to back the state AsyncSequence, which means:

  • The latest value is replayed to new observers automatically.
  • Multiple observers can consume the same stream safely.
  • currentState gives you an immediate snapshot for view re-sync on appear.

BaseLoadable, RetryableLoader, and ConcurrencyLimitingLoadable all use Observation-backed streams. For types that still use AsyncStream internally (for example, DebouncingLoadable), wrap them in another loader if you need multi-observer replay semantics.
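
To make the replay behavior concrete, here is a standalone sketch of the general technique. StateStore and its helpers are assumptions for illustration, not the library’s actual BaseLoadable internals:

import Observation

@MainActor
@Observable
final class StateStore<Value: Hashable & Sendable> {
    private(set) var currentState: LoadingState<Value> = .idle

    func update(_ newState: LoadingState<Value>) {
        currentState = newState
    }

    /// Every call hands out a fresh sequence that starts with the latest value
    /// and then yields each subsequent change, so late observers never miss state.
    var states: AsyncStream<LoadingState<Value>> {
        AsyncStream { continuation in
            let task = Task { @MainActor in
                while !Task.isCancelled {
                    // Replay the latest value immediately.
                    continuation.yield(self.currentState)
                    // Suspend until Observation reports a change to currentState.
                    // (Simplified: cancellation only takes effect at the next change.)
                    await withCheckedContinuation { resume in
                        withObservationTracking {
                            _ = self.currentState
                        } onChange: {
                            resume.resume()
                        }
                    }
                }
                continuation.finish()
            }
            continuation.onTermination = { _ in task.cancel() }
        }
    }
}

Because every observer receives the current value as its first element, navigating back to a screen immediately shows the .loaded content instead of falling back to .idle.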

Composing loaders

The Loadable protocol and BaseLoadable class create a powerful foundation for composition. The library includes wrappers that add functionality to any Loadable object.

Retry

Loaders can be composed: wrapping one in RetryableLoader adds automatic retries with exponential backoff.

// Create a loader that might fail
let flakeyLoader = FlakeyLoader(successAfterAttempts: 3)

// Wrap it to add retry logic
let retryableLoader = RetryableLoader(
    base: flakeyLoader,
    maxAttempts: 5
)

// Use it in the view. It will automatically retry on failure.
LoadingView(loader: retryableLoader) { ... }
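
Under the hood, retry with exponential backoff boils down to a loop like this sketch; withRetry is a stand-in for illustration, while RetryableLoader applies the same idea to a wrapped Loadable:

/// Retries `operation` up to `maxAttempts` times (at least 1), doubling the
/// delay after each failure: 1s, 2s, 4s, ... The last error propagates.
func withRetry<T>(maxAttempts: Int, _ operation: () async throws -> T) async throws -> T {
    for attempt in 1..<maxAttempts {
        do {
            return try await operation()
        } catch {
            // Wait before the next attempt; a cancelled sleep ends the retries.
            try await Task.sleep(for: .seconds(1 << (attempt - 1)))
        }
    }
    // Final attempt: let any error reach the caller.
    return try await operation()
}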

Debounce

Another example of composition is debouncing input from a search field: this wrapper delays load() calls until the user stops typing.

// Create a loader that performs a search
let searchLoader = SearchLoader()

// Wrap it to add debouncing
let debouncedLoader = await DebouncingLoadable(
    wrapping: searchLoader,
    debounceInterval: 0.5 // 500ms
)

// In the view, call load() on every keystroke.
// The wrapper ensures the actual search is only triggered when needed.
TextField("Search...", text: $searchText)
    .onChange(of: searchText) {
        Task { await debouncedLoader.load() }
    }
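
Under the hood, debouncing boils down to cancelling the previous pending call and starting a new delay, so only the last call in a burst actually fires. A standalone sketch of that idea (DebounceSketch is an assumption, not the library’s type):

@MainActor
final class DebounceSketch<Base: Loadable> {
    private let base: Base
    private let interval: Duration
    private var pending: Task<Void, Never>?

    init(base: Base, interval: Duration = .milliseconds(500)) {
        self.base = base
        self.interval = interval
    }

    func load() {
        pending?.cancel()                         // supersede the previous pending call
        pending = Task {
            try? await Task.sleep(for: interval)  // the debounce window
            guard !Task.isCancelled else { return }
            await base.load()                     // only the last call in a burst fires
        }
    }
}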

Limiting concurrency

This example caps the number of parallel downloads with a token bucket; see the source of ConcurrencyLimitingLoadable for the details, and the sketch after the snippet below for the general idea.

Button("Start Downloads") {
    let baseLoader = ParallelDownloadLoader(itemCount: numberOfItems)
    loader = ConcurrencyLimitingLoadable(
        wrapping: baseLoader,
        concurrencyLimit: concurrencyLimit
    )
}
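
To make the token-bucket idea concrete, here is a standalone sketch; TokenBucket and downloadAll are assumptions for illustration, while the library’s ConcurrencyLimitingLoadable wraps a Loadable instead:

import Foundation

/// Tokens represent permission to run; callers wait when none are free.
actor TokenBucket {
    private var available: Int
    private var waiters: [CheckedContinuation<Void, Never>] = []

    init(tokens: Int) {
        available = tokens
    }

    func acquire() async {
        if available > 0 {
            available -= 1
            return
        }
        // No token free: park until release() hands one over.
        await withCheckedContinuation { waiters.append($0) }
    }

    func release() {
        if waiters.isEmpty {
            available += 1
        } else {
            waiters.removeFirst().resume()   // pass the token straight to a waiter
        }
    }
}

/// Downloads every URL, but never more than `limit` at a time.
func downloadAll(_ items: [URL], limit: Int) async {
    let bucket = TokenBucket(tokens: limit)
    await withTaskGroup(of: Void.self) { group in
        for item in items {
            group.addTask {
                await bucket.acquire()
                _ = try? await URLSession.shared.data(from: item)   // the actual download
                await bucket.release()
            }
        }
    }
}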

Conclusion

Here is what we ended up with:

  • Consistency with a reusable solution
  • Type-safety with an enum-based state
  • Customizable views
  • Progress tracking
  • Composable loaders

The source is available on GitHub. The final architecture provides a reusable, powerful, and easy-to-use pattern for managing screen states in any SwiftUI application.