Serializers¶
Serializers can be attached to backends in order to serialize/deserialize the data sent to and retrieved from the backend. This lets you apply transformations when you want the data stored in a specific format in your cache backend. For example, imagine you have a Model and want to turn it into something that Redis can understand (Redis can't store Python objects). That is the job of a serializer.
To use a specific serializer:
>>> from aiocache import Cache
>>> from aiocache.serializers import PickleSerializer
>>> cache = Cache(Cache.MEMORY, serializer=PickleSerializer())
Currently the following are built in:
- NullSerializer: returns values untouched, without any transformation.
- StringSerializer: casts values to str when storing them.
- PickleSerializer: uses pickle, so arbitrary picklable Python objects can be stored.
- JsonSerializer: uses json (or ujson when it is available).
- MsgPackSerializer: uses msgpack (the msgpack package must be installed).
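For instance, a minimal sketch of using one of the built-in serializers with the in-memory backend (the key and value here are just illustrative) could look like this:

import asyncio

from aiocache import Cache
from aiocache.serializers import JsonSerializer

# JsonSerializer dumps the value to a JSON string before it is stored
# and parses it back on retrieval.
cache = Cache(Cache.MEMORY, serializer=JsonSerializer())


async def main():
    await cache.set("config", {"retries": 3, "timeout": 0.5})
    assert await cache.get("config") == {"retries": 3, "timeout": 0.5}
    await cache.close()


if __name__ == "__main__":
    asyncio.run(main())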
If the built-in serializers do not cover your needs, you can always define a custom serializer, as shown in examples/serializer_class.py:
import asyncio
import zlib

import redis.asyncio as redis

from aiocache import Cache
from aiocache.serializers import BaseSerializer


class CompressionSerializer(BaseSerializer):

    # This is needed because zlib works with bytes.
    # This way the underlying backend knows how to
    # store/retrieve values.
    DEFAULT_ENCODING = None

    def dumps(self, value):
        print("I've received:\n{}".format(value))
        compressed = zlib.compress(value.encode())
        print("But I'm storing:\n{}".format(compressed))
        return compressed

    def loads(self, value):
        print("I've retrieved:\n{}".format(value))
        decompressed = zlib.decompress(value).decode()
        print("But I'm returning:\n{}".format(decompressed))
        return decompressed


cache = Cache(
    Cache.REDIS,
    serializer=CompressionSerializer(),
    namespace="main",
    client=redis.Redis(),
)


async def serializer():
    text = (
        "Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt"
        "ut labore et dolore magna aliqua. Ut enim ad minim veniam, quis nostrud exercitation"
        "ullamco laboris nisi ut aliquip ex ea commodo consequat. Duis aute irure dolor in"
        "reprehenderit in voluptate velit esse cillum dolore eu fugiat nulla pariatur. Excepteur"
        "sint occaecat cupidatat non proident, sunt in culpa qui officia deserunt mollit"
        "anim id est laborum.")
    await cache.set("key", text)
    print("-----------------------------------")
    real_value = await cache.get("key")
    # raw() sends the command straight to the redis client, bypassing the
    # serializer, so this returns the compressed bytes as stored.
    compressed_value = await cache.raw("get", "main:key")
    assert len(compressed_value) < len(real_value.encode())


async def test_serializer():
    await serializer()
    await cache.delete("key")
    await cache.close()


if __name__ == "__main__":
    asyncio.run(test_serializer())
You can also use marshmallow as your serializer (examples/marshmallow_serializer_class.py):
import random
import string
import asyncio
from typing import Any

from marshmallow import fields, Schema, post_load

from aiocache import Cache
from aiocache.serializers import BaseSerializer


class RandomModel:
    MY_CONSTANT = "CONSTANT"

    def __init__(self, int_type=None, str_type=None, dict_type=None, list_type=None):
        self.int_type = int_type or random.randint(1, 10)
        self.str_type = str_type or random.choice(string.ascii_lowercase)
        self.dict_type = dict_type or {}
        self.list_type = list_type or []

    def __eq__(self, obj):
        return self.__dict__ == obj.__dict__


class RandomSchema(Schema):
    int_type = fields.Integer()
    str_type = fields.String()
    dict_type = fields.Dict()
    list_type = fields.List(fields.Integer())

    @post_load
    def build_my_type(self, data, **kwargs):
        return RandomModel(**data)

    class Meta:
        strict = True


class MarshmallowSerializer(BaseSerializer):
    def __init__(self, *args: Any, **kwargs: Any):
        super().__init__(*args, **kwargs)
        self.schema = RandomSchema()

    def dumps(self, value: Any) -> str:
        return self.schema.dumps(value)

    def loads(self, value: str) -> Any:
        return self.schema.loads(value)


cache = Cache(serializer=MarshmallowSerializer(), namespace="main")


async def serializer():
    model = RandomModel()
    await cache.set("key", model)

    result = await cache.get("key")

    assert result.int_type == model.int_type
    assert result.str_type == model.str_type
    assert result.dict_type == model.dict_type
    assert result.list_type == model.list_type


async def test_serializer():
    await serializer()
    await cache.delete("key")


if __name__ == "__main__":
    asyncio.run(test_serializer())
By default, cache backends assume they are working with str types. If your custom implementation transforms data to bytes, you will need to set the encoding to None, for example by setting the DEFAULT_ENCODING class attribute to None as the CompressionSerializer above does.
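As a minimal sketch, a bytes-producing serializer (the class name MyBytesSerializer is just illustrative) only needs dumps, loads and that class attribute:

import pickle

from aiocache.serializers import BaseSerializer


class MyBytesSerializer(BaseSerializer):
    # dumps returns bytes, so tell the backend not to decode
    # retrieved values back into str.
    DEFAULT_ENCODING = None

    def dumps(self, value):
        return pickle.dumps(value)

    def loads(self, value):
        # On a cache miss the backend hands loads a None value.
        if value is None:
            return None
        return pickle.loads(value)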