Tuesday 3 September 2013

1. Description:


LruCache is a cache that uses a Least Recently Used (LRU) eviction policy: when the cache is full and new data needs to be added, the least recently used entries are evicted first. LRU eviction is used in many caching systems; memcached, for example, uses it.

LruCache was introduced in API level 12 (Android 3.1), but is available through the Support Library all the way back to Android 1.6. It is a generic class which we can strongly type when subclassing it.

2. Code:

public class LruMemoryCache extends LruCache<String, Bitmap> 
{
    private final Context context;
     
    public LruMemoryCache(Context context)
    {
        super( 10 ); // cache at most 10 items
        this.context = context;
    }
 
    @Override
    protected Bitmap create( String key )
    {
        // Called on a cache miss to load the bitmap for this key
        return Utils.loadAsset( context, key );
    }
}


We’ve defined a cache which uses String keys to index Bitmap objects, much the same as we did in the previous part of this series. The implementation is pretty straightforward: in the constructor we call the constructor of the base class to set the size of our cache to 10 items, and we override create(), which calls the utility method that we defined in part 2.
Using this cache is simpler still. We first create an instance of our cache:

 
lruMemCache = new LruMemoryCache( getApplicationContext());


and then obtain items using the get() method:

Bitmap bitmap = lruMemCache.get( ASSET_NAME );

So, how does it work? LruCache maintains a list of cached items. Whenever we try to access a specific item, it tries to find it in the cache. If the item already exists, it is moved to the head of the list and returned. If the item does not exist, then our create() method is called to create a new instance, and this is added to the head of the list before being returned. As a new item is added to the list, LruCache checks whether the addition would cause the list to exceed the size that we declared earlier. If not, it simply adds it; if so, it first deletes the item at the tail of the list. Thus we now have a cache which is not at the mercy of garbage collection, and which is optimised to keep recently used items in the cache.
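The list-moving behaviour described above can be sketched in a few lines of plain Java using LinkedHashMap's access-order mode. This is an illustrative toy, not Android's actual implementation; the class name MiniLru is made up here:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of the mechanism described above: access moves an
// entry to the "most recent" end, and inserting past the cap evicts
// the least recently used entry. Not Android's real LruCache code.
public class MiniLru<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public MiniLru(int maxEntries) {
        // accessOrder = true: get() reorders entries by recency of use
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Evict the least recently used entry once the cap is exceeded
        return size() > maxEntries;
    }
}
```

With a cap of 2, putting "a" and "b", touching "a" with get(), then putting "c" evicts "b" — exactly the tail-eviction behaviour described above.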

So, we’ve got some pretty useful functionality in just a few lines of code, but LruCache gives us more. Images can be tricky beasts because they vary in size quite significantly. Holding an arbitrary number of images can cause the total size of the cache to vary enormously depending on the sizes of the individual images. So what if we want to limit the memory usage of our cache? LruCache allows us to override the way that the cache size is calculated. We do this by first changing how the size of each cached item is calculated:
@Override
protected int sizeOf( String key, Bitmap value )
{
    // Measure each entry by its byte count rather than counting it as 1
    return value.getByteCount();
}

The default implementation of sizeOf() simply returns 1, so this provides the default behaviour that we have already seen with a static number of items. We have overridden this to return the size of the bitmap instead.
Next we need to change how we specify the size of our cache by changing the constructor:



public LruMemoryCache(Context context)
{
    super( 5 * 1024 * 1024 ); // maximum size: 5 MiB, in bytes
    this.context = context;
}

Here we are specifying a maximum cache size of 5MiB. Whenever the cache exceeds this, items will be evicted from the tail until the size drops below 5MiB once again. We can use whatever units we like, provided that the maximum size we pass to the constructor is in the same units as the values returned by sizeOf().
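Rather than hard-coding 5MiB, a common pattern (an assumption on my part, not something from the code above) is to size the cache as a fraction of the heap available to the app:

```java
// Sketch: derive the cache budget from the JVM's maximum heap size.
// The 1/8 fraction is a widely used rule of thumb, not a fixed rule;
// tune it to your app's memory profile.
public class CacheSizer {
    public static int suggestedCacheBytes() {
        long maxBytes = Runtime.getRuntime().maxMemory(); // heap ceiling in bytes
        return (int) Math.min(Integer.MAX_VALUE, maxBytes / 8);
    }
}
```

The result would then be passed to the LruCache constructor in place of the hard-coded `5 * 1024 * 1024`, keeping the units (bytes) consistent with our sizeOf().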
This is a much better cache implementation. It gives us greater control and cached items will potentially last much longer than the next GC.
Remember to clear your cache if memory is running low. If onLowMemory() is called on your Activity, it is much better to call evictAll() on your cache and allow it to be rebuilt than it is for your app to crash with an OutOfMemoryError!
LruCache can also be used to manage a cache on the SD card. Rather than storing bitmaps in the cache, you can store File objects instead, calculate the size using file.length(), and override entryRemoved() in LruCache to delete the physical file on the SD card when its File is evicted from the cache.
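Outside of Android, the same delete-on-eviction idea can be sketched with plain Java; on Android you would put the delete() call in LruCache.entryRemoved() instead. The class name FileLru is illustrative:

```java
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of a file-backed LRU cache: when an entry is
// evicted, the physical file behind it is deleted as well. On Android,
// the delete() call would live in LruCache.entryRemoved().
public class FileLru extends LinkedHashMap<String, File> {
    private final int maxFiles;

    public FileLru(int maxFiles) {
        super(16, 0.75f, true); // access order, as in an LRU
        this.maxFiles = maxFiles;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<String, File> eldest) {
        if (size() > maxFiles) {
            eldest.getValue().delete(); // remove the physical file too
            return true;                // and drop the cache entry
        }
        return false;
    }
}
```

A production disk cache would also need to size entries by file.length() rather than by count, and to handle delete() failures, which this sketch ignores.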
That completes our look at caching. Hopefully you will have a better understanding of some of the tools available to help you to select the correct approach to meet your caching requirements.


3. Note:
    You can visit these links for a better understanding and more in-depth information.

4. Conclusion:
  • Know how to release resources using an LRU cache.
  • Know how to avoid an OutOfMemoryError using an LRU cache.
5. About the post:
  •  The code largely explains itself through its comments and is easy to understand.
  •  Feel free to leave a comment with anything you would like to ask, suggest, or recommend.
  •  Hope you enjoy it!
Cheers,
Hamad Ali Shaikh