GS2-Datastore

Binary data storage feature

GS2-Datastore allows you to store arbitrary binary data on the server.

Uploaded data can be assigned access permissions: public (open to everyone), protected (visible only to specified user IDs, up to 100), or private (visible only to the uploader).

GS2-Datastore is primarily intended for uploading data such as UGC or racing-game ghost data, and is not necessarily suitable for storing player user data.

This is because storing something like an item count as binary data would let players duplicate items by tampering with the save data or the application binary, or inflate acquired quantities by playing with a modded application. For values like item counts, using a dedicated microservice such as GS2-Inventory prevents this kind of cheating.

This does not rule out storing user data whose tampering has no effect on game balance, such as configuration values.

Use cases

Typical use cases for GS2-Datastore are as follows:

  • Storing UGC content such as screenshots or replays posted by users
  • Sharing ghost data in racing games
  • Cloud sharing of photos captured in photo mode
  • Storing user data such as game settings whose tampering does not affect game balance
  • Publishing and sharing custom map data for stages

Access scope

Data objects have three types of access scope, used depending on the purpose.

| Scope | Description | Main use |
|---|---|---|
| public | Downloadable by all players | Publishing UGC content, ghost sharing |
| protected | Downloadable only by users specified in allowUserIds (up to 100) | Sharing only with friends |
| private | Downloadable only by the user who uploaded it | Personal save data, configuration values |

The scope and allowed users can also be changed later with UpdateDataObject.

Architecture

GS2-Datastore manages metadata that records the storage location of binary data, and uses external cloud storage for the actual binary data storage. Therefore, the upload/download process requires multiple steps.

In the game-engine SDKs this flow is hidden behind high-level APIs, so you normally do not need to be aware of it. The SDKs for general-purpose programming languages, however, do not provide a high-level API, so you must perform each step yourself.

Upload process

sequenceDiagram
  actor Player as Player
  participant Namespace as GS2-Datastore#Namespace
  participant Storage as Cloud Storage
  Player->>Namespace: PrepareUpload
  Namespace-->>Player: Cloud Storage URL
  Player->>Storage: Upload Payload
  Storage-->>Player: OK
  Player->>Namespace: DoneUpload
  Namespace->>Storage: Check exists
  Namespace-->>Player: OK
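
When no high-level API is available, the three steps in the diagram above have to be issued individually. Below is a minimal, language-agnostic sketch in Python of that orchestration; `prepare_upload`, `put_to_storage`, and `done_upload` are hypothetical stand-ins for the corresponding PrepareUpload, cloud-storage PUT, and DoneUpload calls, and the real SDK/REST signatures differ:

```python
def upload_data(prepare_upload, put_to_storage, done_upload, payload: bytes) -> str:
    """Run the three-step upload flow: prepare, PUT the payload, report done.

    All three callables are hypothetical stand-ins:
      prepare_upload()            -> (upload_url, data_object_name)
      put_to_storage(url, bytes)  -> performs the HTTP PUT to cloud storage
      done_upload(name)           -> reports completion so the object becomes ACTIVE
    """
    upload_url, name = prepare_upload()   # step 1: PrepareUpload (get a storage URL)
    put_to_storage(upload_url, payload)   # step 2: PUT the payload to cloud storage
    done_upload(name)                     # step 3: DoneUpload (server verifies existence)
    return name
```

The point of the shape is that DoneUpload must not be reported until the PUT has succeeded; otherwise the server-side existence check in the diagram fails.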

Download process

sequenceDiagram
  actor Player as Player
  participant Namespace as GS2-Datastore#Namespace
  participant Storage as Cloud Storage
  Player->>Namespace: PrepareDownload
  Namespace-->>Player: Cloud Storage URL
  Player->>Storage: Download
  Storage-->>Player: Payload

Downloading during the upload process

GS2-Datastore can update data that has already been uploaded. Between the calls to PrepareReUpload and DoneUpload, the old file from before the update remains downloadable, so clients never receive half-written data.

Retrieving past versions of uploaded data

GS2-Datastore retains past versions of data for 30 days. By retrieving a data object's update history (DataObjectHistory), you can obtain the generation ID of each past generation and download the data of a specific generation by specifying that ID.

This also applies to deleted data; the data is actually deleted 30 days after a deletion request. However, this condition may not apply when data has been deleted due to legal requirements.
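
The 30-day retention window described above can be expressed as a small check. A sketch in Python; the 30-day cutoff comes from the text, while the function name and timestamps are illustrative:

```python
from datetime import datetime, timedelta, timezone

# Deleted data is retained for 30 days before it is actually removed (per the text).
RETENTION = timedelta(days=30)

def is_restorable(deleted_at: datetime, now: datetime) -> bool:
    """True if a deleted data object is still within its 30-day retention window."""
    return now - deleted_at < RETENTION
```

Note the legal-deletion exception mentioned above is deliberately not modeled here.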

Data size and status

The maximum size of a single data object is 10 MB. During upload the status is UPLOADING; once the upload completes it becomes ACTIVE; and after a deletion request it transitions to DELETED. Data in the DELETED state can be restored within 30 days using RestoreDataObject.

stateDiagram-v2
  [*] --> UPLOADING: PrepareUpload
  UPLOADING --> ACTIVE: DoneUpload
  ACTIVE --> UPLOADING: PrepareReUpload
  ACTIVE --> DELETED: DeleteDataObject
  DELETED --> ACTIVE: RestoreDataObject (within 30 days)
  DELETED --> [*]: After 30 days
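
The lifecycle above can be captured as a small transition table. A sketch in Python, with the states and transitions taken directly from the diagram (the helper name is illustrative):

```python
# Allowed status transitions, taken from the state diagram above.
TRANSITIONS = {
    ("UPLOADING", "DoneUpload"): "ACTIVE",
    ("ACTIVE", "PrepareReUpload"): "UPLOADING",
    ("ACTIVE", "DeleteDataObject"): "DELETED",
    ("DELETED", "RestoreDataObject"): "ACTIVE",  # only within 30 days
}

def next_status(status: str, action: str) -> str:
    """Return the status after `action`, or raise if the transition is invalid."""
    try:
        return TRANSITIONS[(status, action)]
    except KeyError:
        raise ValueError(f"{action} is not valid in status {status}")
```

For example, calling DoneUpload on an object that is not UPLOADING has no valid target state.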

Main attributes of a data object

| Attribute | Description |
|---|---|
| dataObjectId | Unique ID (GRN) of the data object |
| name | Data object name (unique per user) |
| userId | ID of the user who uploaded the data |
| scope | Access scope (public / protected / private) |
| allowUserIds | User IDs allowed to download under the protected scope (up to 100) |
| platform | Platform information used for the upload |
| status | Data state (UPLOADING / ACTIVE / DELETED) |
| generation | Identifier of the current generation |
| previousGeneration | Identifier of the previous generation |

Script Triggers

Event triggers can be configured to invoke GS2-Script before and after the upload completion notification of a data object. Synchronous execution can be used to reject the completion report, and asynchronous execution enables external integration through Amazon EventBridge.

Main event triggers and script setting names are:

  • doneUploadScript (notification: doneUploadDone): runs before and after the upload completion report (DoneUpload)

Because a synchronous script can reject the upload, you can, for example, have an external service inspect an uploaded image and refuse registration if its content is inappropriate.

Implementation Example

Uploading data

Uploading data consists of three steps (PrepareUpload → PUT to cloud storage → DoneUpload), but with the game-engine SDKs it can be completed with a single call to UploadAsync.

    // Unity (C#)
    var result = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).UploadAsync(
        name: "dataObject-0001",
        scope: "public",
        data: data
    );

    var item = await result.ModelAsync();
    var dataObjectId = item.DataObjectId;

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    );
    const auto Future = Domain->Upload(
        "dataObject-0001", // name
        data, // data
        "public" // scope
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Re-uploading existing data

Use re-upload when you want to write new binary data to a data object with the same name. During re-upload, the old data before the update can still be retrieved until DoneUpload is called.

    // Unity (C#)
    var domain = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).ReUploadAsync(
        data: newData
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObject(
        "dataObject-0001" // dataObjectName
    );
    const auto Future = Domain->ReUpload(
        NewData
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Download data (specify data object ID)

    // Unity (C#)
    var binary = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DownloadAsync(
        dataObjectId: dataObjectId
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    );
    const auto Future = Domain->Download(
        dataObjectId // dataObjectId
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Download data (specify user ID and data object name)

    // Unity (C#)
    var binary = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).User(
        userId: "user-0001"
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).DownloadByUserIdAndDataObjectNameAsync(
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->User(
        "user-0001" // userId
    )->DataObject(
        "dataObject-0001" // dataObjectName
    );
    const auto Future = Domain->DownloadByUserIdAndDataObjectName(
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Download data uploaded by yourself (specify data object name)

    // Unity (C#)
    var binary = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).DownloadOwnAsync(
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObject(
        "dataObject-0001" // dataObjectName
    );
    const auto Future = Domain->DownloadOwn(
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Get a list of data you uploaded

    // Unity (C#)
    var items = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObjectsAsync(
    ).ToListAsync();

    // Unreal Engine (C++)
    const auto It = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObjects();
    TArray<Gs2::UE5::Datastore::Model::FEzDataObjectPtr> Result;
    for (auto Item : *It)
    {
        if (Item.IsError())
        {
            return false;
        }
        Result.Add(Item.Current());
    }

Change the access scope of a data object

Using UpdateDataObject, you can change the access scope and the list of allowed users later. For example, it is possible to first create a data object as private and switch it to public once it is ready to be shared.

    // Unity (C#)
    var domain = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).UpdateDataObjectAsync(
        scope: "protected",
        allowUserIds: new [] { "user-0002", "user-0003" }
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObject(
        "dataObject-0001" // dataObjectName
    );
    const auto Future = Domain->UpdateDataObject(
        "protected", // scope
        { "user-0002", "user-0003" } // allowUserIds
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Delete a data object

Within 30 days after the deletion request, the data can be restored using RestoreDataObject.

    // Unity (C#)
    await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).DeleteDataObjectAsync(
    );

    // Unreal Engine (C++)
    const auto Domain = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObject(
        "dataObject-0001" // dataObjectName
    );
    const auto Future = Domain->DeleteDataObject(
    );
    Future->StartSynchronousTask();
    if (Future->GetTask().IsError()) return false;

Get update history of a data object

By retrieving past generations, you can implement rollback or browse past replays.

    // Unity (C#)
    var items = await gs2.Datastore.Namespace(
        namespaceName: "namespace-0001"
    ).Me(
        gameSession: GameSession
    ).DataObject(
        dataObjectName: "dataObject-0001"
    ).DataObjectHistoriesAsync(
    ).ToListAsync();

    // Unreal Engine (C++)
    const auto It = Gs2->Datastore->Namespace(
        "namespace-0001" // namespaceName
    )->Me(
        AccessToken
    )->DataObject(
        "dataObject-0001" // dataObjectName
    )->DataObjectHistories();
    TArray<Gs2::UE5::Datastore::Model::FEzDataObjectHistoryPtr> Result;
    for (auto Item : *It)
    {
        if (Item.IsError())
        {
            return false;
        }
        Result.Add(Item.Current());
    }

Restricting access to older generations of data

When downloading data, a generation ID is specified to identify the exact file. Including the generation ID in a download request ensures that you reliably download the data as it was at the time it was listed.

However, it is not always desirable for older generations to remain downloadable indefinitely. For this reason, an option is available that, for everyone except the data's owner, limits downloads of older data to the immediately previous generation only, and only within 60 minutes of the update.
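
When that option is enabled, the rule can be summarized as a predicate like the following Python sketch. The 60-minute window, the owner exemption, and the previousGeneration attribute come from the text above; the function and parameter names are illustrative:

```python
from datetime import datetime, timedelta

# Non-owners may fetch an older generation only within 60 minutes of the update.
WINDOW = timedelta(minutes=60)

def can_download_generation(requester_is_owner: bool,
                            requested_generation: str,
                            current_generation: str,
                            previous_generation: str,
                            updated_at: datetime,
                            now: datetime) -> bool:
    """Evaluate the older-generation access restriction described above."""
    # The current generation is always downloadable; the owner is never restricted.
    if requested_generation == current_generation or requester_is_owner:
        return True
    # Others may only fetch the immediately previous generation, within 60 minutes.
    return (requested_generation == previous_generation
            and now - updated_at <= WINDOW)
```

This is only a model of the documented behavior, not the server-side implementation.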

Detailed Reference