How do LTE and UMTS transition away from the full-power state?


I'm trying to understand how LTE and UMTS/HSPA+ transition away from the full-power state, and what resources are tied up at the tower when the mobile device's radio is operating at full power.

I've looked at Ch. 7 and 8 of High Performance Browser Networking, and also at an AT&T Research article mentioned in the book, but they don't address this question directly.

  • Is the timeout for the transition from full power to half power in UMTS (DCH to FACH) almost always 5 s, and where does the 5 s value (mentioned in the AT&T Research article above) come from?

  • Is the UMTS timeout for the transition away from the full-power DCH state reset when minor traffic is sent before the timeout expires, or does that depend on whether such minor traffic could subsequently be handled through the shared low-speed channel in the half-power FACH state?

  • What's the timeout in LTE for the transition away from the full-power state?

  • What resources are tied up at the tower in the UMTS and LTE full-power states, and what are the implications for the carrier?

  • How much is the transition away from the full-power states dictated by the battery-consumption concerns of the mobile device, as opposed to conserving tower resources for the carrier? For example, if the device is plugged into a charger, would it ever be allowed, or make sense, to always operate the radio in the full-power state with UMTS and LTE?


3 Answers


UMTS:

The delay in transitioning from DCH to FACH is governed by a timer, commonly called T1, which in this case the network has configured to 5 seconds. Whatever the value, it is a compromise between device battery consumption and the signalling load between network elements.

For mobile applications that exchange small packets periodically but infrequently, a long timer keeps the device in the high-power state for many additional, unnecessary seconds, draining the battery.

Prior to 3GPP Release 8, device manufacturers addressed this issue themselves: instead of waiting for the network-initiated transition to FACH, the device would send a signalling connection release indication (SCRI) once it was done sending and receiving data. This placed the device in idle mode, the lowest-power state.

However, this solution had a drawback: the SCRI caused unnecessary signalling load between the network elements as the RAB was frequently released and set up again.

Release 8 addressed this by adding a specific cause value (UE requested PS data session end) to SCRI that explicitly tells the network the device is done sending and receiving data. This lets the network distinguish between the different reasons for releasing the connection; if releases happen too frequently, it can deny the request and avoid the signalling load.

See Fast Dormancy Best Practices by GSMA for more info.
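The timer-reset behaviour asked about in the question can be illustrated with a toy model of the inactivity timer and pre-Release-8 fast dormancy. This is a sketch, not the 3GPP state machine: the class name, method names, and the idea of a single network-side tick are all invented for illustration; T1 = 5 s is the value discussed above.

```python
T1 = 5.0  # seconds of inactivity before the network demotes DCH -> FACH

class ToyRrc:
    """Toy model of a UE's RRC state as seen in this discussion."""

    def __init__(self, fast_dormancy=False):
        self.state = "IDLE"
        self.last_activity = 0.0
        self.fast_dormancy = fast_dormancy

    def send_data(self, t):
        """Any traffic promotes the UE to DCH and resets the inactivity timer."""
        self.state = "DCH"
        self.last_activity = t

    def done_sending(self, t):
        """Pre-Release-8 fast dormancy: send SCRI and drop straight to idle
        instead of waiting out the network timer."""
        if self.fast_dormancy:
            self.state = "IDLE"

    def tick(self, t):
        """Network-side timeout check: demote DCH -> FACH after T1 of silence."""
        if self.state == "DCH" and t - self.last_activity >= T1:
            self.state = "FACH"

ue = ToyRrc()
ue.send_data(0.0)
ue.tick(3.0)       # within T1 -> stays in DCH
assert ue.state == "DCH"
ue.send_data(4.0)  # minor traffic resets the inactivity timer
ue.tick(8.0)       # only 4 s since last activity -> still DCH
assert ue.state == "DCH"
ue.tick(9.0)       # 5 s of silence -> demoted to FACH
assert ue.state == "FACH"
```

In this simplified model, any traffic resets the timer regardless of its size; a real network may apply traffic-volume thresholds when deciding the target state.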

LTE:

LTE is simpler: there are only two RRC states, connected and idle. The timers are still controlled by the network, but remaining in RRC connected state is not as harmful to the UE in LTE, because discontinuous reception (DRX) keeps power consumption lower. Transitioning between the two states also causes much less signalling load in LTE, which was an explicit design goal.
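The DRX saving can be illustrated with a back-of-the-envelope duty-cycle calculation. The current-draw figures and cycle lengths below are invented for illustration only; real values depend on the chipset and the network's DRX configuration.

```python
# Why RRC connected is cheaper in LTE: with DRX the receiver is only
# on for a short window in each cycle, dozing the rest of the time.
# All numbers here are assumptions for illustration.

def avg_current_ma(on_ma, sleep_ma, on_ms, cycle_ms):
    """Average current over one DRX cycle: receiver on for on_ms,
    sleeping at sleep_ma for the remainder of the cycle."""
    duty = on_ms / cycle_ms
    return duty * on_ma + (1 - duty) * sleep_ma

# Receiver always on vs. a 10 ms on-window in a 1280 ms DRX cycle:
always_on = avg_current_ma(on_ma=200, sleep_ma=5, on_ms=1280, cycle_ms=1280)
with_drx  = avg_current_ma(on_ma=200, sleep_ma=5, on_ms=10,   cycle_ms=1280)

print(f"no DRX:   {always_on:.1f} mA")  # 200.0 mA
print(f"with DRX: {with_drx:.1f} mA")   # 6.5 mA
```

Even with these made-up numbers, the duty cycle dominates: staying "connected" costs little as long as the receiver sleeps for most of each cycle.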


The RRC CELL_DCH state always has the highest power consumption. The CELL_FACH state is lower than DCH, and the URA_PCH state is the lowest of them all.

Fast dormancy helps switch the UE from the highest to the lowest power state. In connected mode the power ordering is DCH > FACH > URA.


It's an odd claim that the RRC transition from DCH to FACH takes 5 s; it is usually faster than that. The longer the UE lingers, the more RRC resources the network ties up for its RRC instance, so it is in the interest of good design to keep the RRC hang time as short as possible, saving computing and spectral resources.

Coming back to your main question: CELL_DCH consumes the most power, CELL_FACH consumes bursts of power, and IDLE consumes the least. The bursts come from cell reselections and RRC connection establishment requests.
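To put rough numbers on that ordering, here is a toy tail-energy calculation. All power levels and timeout values are assumed for illustration (the 5 s DCH tail is the figure from the question, not a typical network value), but they show why short transfers are dominated by the time spent lingering in the high-power states.

```python
# Toy "tail energy" model: after each small transfer the UE lingers
# in CELL_DCH for the inactivity timeout, then in CELL_FACH, burning
# elevated power the whole time. All figures below are assumptions.

P_DCH, P_FACH = 0.8, 0.3        # watts in each state (assumed)
TAIL_DCH, TAIL_FACH = 5.0, 12.0  # state timeouts in seconds (assumed)

def energy_per_transfer(transfer_s):
    """Joules spent on one small transfer plus the state tails that follow."""
    return (P_DCH * (transfer_s + TAIL_DCH)  # active transfer + DCH tail
            + P_FACH * TAIL_FACH)            # FACH tail before idle

cost = energy_per_transfer(0.2)       # a 0.2 s keep-alive
tail_only = energy_per_transfer(0.0)  # the tails alone
print(f"{cost:.2f} J total, of which {tail_only:.2f} J is tail energy")
```

With these assumed figures, almost all of the energy for a 0.2 s transfer is tail energy, which is why both the timer values and fast dormancy matter so much for periodic small-packet traffic.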

Here is a proper RRC state diagram: http://images.books24x7.com/bookimages/id_6399/fig209_01.jpg

Here is an RRC state power-consumption graph I found via Google Images: http://3.bp.blogspot.com/-NoMR5oNLbCs/T3H1i0bsdgI/AAAAAAAAAW0/pv0G-tG0auk/s1600/Power+Consumption+Vs+RRC+states.png

If the article's data is correct, what I can deduce is that the measurements caught the UE's RRC state machine in "hysteresis": it took the UE 5 s to decide on the next RRC state. That would point to a network design or degradation issue.