The current model of receiving updates through a package manager or update manager relies on a client/server architecture, whereby the client pings the update servers to see if updates are available and grabs them from a designated mirror. Time and time again this model has shown its flaws, including slow download speeds and high bandwidth and hardware costs.
So how could this model be improved? During UDS I spoke with a Transmission developer who works for Canonical about distributing all updates through a BitTorrent model. In this model, the only dedicated servers are a cloud that does the initial seeding of packages over the BitTorrent protocol; the update managers on other clients (end users' machines) download those new packages and then re-seed them up to a ratio set by each end user.
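To make the re-seeding idea concrete, here is a minimal sketch of the ratio check an update manager might perform. This is purely illustrative (not real Transmission or apt code): the function name, byte counters, and ratio parameter are all assumptions for the example.

```python
# Illustrative sketch of ratio-limited re-seeding: after a client has
# downloaded an update over BitTorrent, it keeps seeding only until its
# upload/download ratio reaches the limit the end user configured.

def should_keep_seeding(uploaded_bytes: int, downloaded_bytes: int,
                        ratio_limit: float) -> bool:
    """Return True while the client is still below its seeding ratio."""
    if downloaded_bytes == 0:
        return False  # nothing fetched yet, so nothing to re-seed
    return uploaded_bytes / downloaded_bytes < ratio_limit

# A client that fetched a 10 MB package, has uploaded 15 MB so far,
# and set a ratio limit of 2.0 would keep seeding:
print(should_keep_seeding(15_000_000, 10_000_000, 2.0))  # True
```

The ratio limit is what keeps the scheme opt-in friendly: users who want to contribute more bandwidth simply set a higher ratio, while a ratio of zero effectively turns re-seeding off.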
Although this is simply an idea, I do think it is feasible, and it would make distributing updates much faster globally while reducing the related IT costs for mirror providers and for Canonical itself. If such an idea were turned into an open source project and made a reality, it could be shared with other projects and distributions to speed up the delivery of updates to their communities, reducing costs and potentially also minimizing energy consumption along the way.