AWS AppSync – HTTP Resolver

AWS AppSync is an enterprise-level, fully managed GraphQL service with real-time data synchronization and offline programming features.
AWS AppSync automatically updates data in web and mobile applications in real time, and brings offline users up to date as soon as they reconnect. This makes it easy to build collaborative mobile and web applications that deliver responsive user experiences.

AWS AppSync lets you resolve GraphQL fields using supported data sources (AWS Lambda, Amazon DynamoDB, or Amazon Elasticsearch Service), as well as arbitrary HTTP endpoints.

Here you can read more about AWS AppSync:

AppSync now supports HTTP endpoints as data sources: the HTTP resolver.
It lets you connect existing backend services that expose REST APIs to AppSync and leverage the power of GraphQL interfaces.

If you are not familiar with GraphQL, I suggest these resources:

In this post we are going to see how to create a new AppSync API, a new HTTP data source, and a new HTTP resolver, and how to run a simple GraphQL query against our backend REST API service.

To run the example I used Python 3.6.3. You also need the AWS SDK for Python, Boto3.

If you have an older version of Boto3, please update it (version 1.7.59 or later is required).

Given our REST API endpoint, we are going to build an API that leverages the power of GraphQL interfaces.
We are going to use this dummy REST API – JSON Placeholder:

Request:

Response:
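The original request/response pair is not reproduced here; hitting the /posts endpoint of JSON Placeholder returns a list of post objects shaped roughly like this (field values illustrative):

```
GET https://jsonplaceholder.typicode.com/posts

[
  {
    "userId": 1,
    "id": 1,
    "title": "...",
    "body": "..."
  }
]
```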

These GraphQL Types describe our data:
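The type definitions are not shown here; based on the shape of a JSON Placeholder post, they presumably look something like this sketch:

```graphql
type Post {
    userId: Int
    id: Int
    title: String
    body: String
}

type Query {
    posts: [Post]
}

schema {
    query: Query
}
```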

Let’s see how to create a new AppSync API.
First of all, create a new AWS AppSync client and then create a new API:

When you create a new API you need to specify the API name and the authentication type. In the example I used the API_KEY authentication type. Here you can read more about authentication types: AWS AppSync Security.
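Since the original snippet is not shown, here is a minimal Boto3 sketch; the API name `PostsApi` and the region are placeholders:

```python
def create_appsync_api(client, name):
    # Create a new GraphQL API secured with an API key.
    response = client.create_graphql_api(
        name=name,
        authenticationType="API_KEY",
    )
    return response["graphqlApi"]["apiId"]

# Usage (requires configured AWS credentials):
# import boto3
# client = boto3.client("appsync", region_name="us-east-1")
# api_id = create_appsync_api(client, "PostsApi")
```

The returned `apiId` is needed by every subsequent call.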

Create a new API key:
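A hedged sketch of this step; the key is what clients later send in the `x-api-key` header:

```python
def create_appsync_api_key(client, api_id):
    # Issue an API key for the API identified by api_id.
    response = client.create_api_key(apiId=api_id)
    return response["apiKey"]["id"]

# Usage:
# api_key = create_appsync_api_key(client, api_id)
```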

Create new GraphQL types:
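A sketch of this step, assuming the Post/Query types above. Note that schema creation in AppSync is asynchronous, so the status must be polled:

```python
import time

SCHEMA_DEFINITION = """
type Post { userId: Int, id: Int, title: String, body: String }
type Query { posts: [Post] }
schema { query: Query }
"""

def create_schema(client, api_id, definition):
    # Schema creation is asynchronous: start it, then poll until done.
    client.start_schema_creation(
        apiId=api_id,
        definition=definition.encode("utf-8"),
    )
    while True:
        status = client.get_schema_creation_status(apiId=api_id)["status"]
        if status != "PROCESSING":
            return status  # e.g. "SUCCESS" or "FAILED"
        time.sleep(1)

# Usage:
# create_schema(client, api_id, SCHEMA_DEFINITION)
```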

Data sources and resolvers are how AWS AppSync translates GraphQL requests and fetches information from your AWS resources.
AWS AppSync supports automatic provisioning and connections for certain data source types. You can use a GraphQL API with your existing AWS resources, or build new data sources and resolvers. The AWS documentation walks through this process in a series of tutorials that explain how the details and tuning options work.

Create a new HTTP data source based on our API:
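A sketch of this step; the data source name `JsonPlaceholder` is a placeholder of my choosing:

```python
def create_http_data_source(client, api_id, name, endpoint):
    # Register an arbitrary HTTP endpoint as an AppSync data source.
    response = client.create_data_source(
        apiId=api_id,
        name=name,
        type="HTTP",
        httpConfig={"endpoint": endpoint},
    )
    return response["dataSource"]["name"]

# Usage:
# create_http_data_source(client, api_id, "JsonPlaceholder",
#                         "https://jsonplaceholder.typicode.com")
```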

The next step is to create the HTTP Resolver.
A resolver uses a request mapping template to convert a GraphQL expression into a format that a data source can understand. Mapping templates are written in Apache Velocity Template Language (VTL).

This is what our request mapping template looks like:
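The template itself is not reproduced here; a typical HTTP-resolver request mapping template for a GET on /posts looks like this sketch:

```
{
    "version": "2018-05-29",
    "method": "GET",
    "resourcePath": "/posts",
    "params": {
        "headers": {
            "Content-Type": "application/json"
        }
    }
}
```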

This is what our response mapping template looks like:
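Again a sketch, following the common pattern of returning the raw body on HTTP 200 and surfacing an error otherwise:

```
#if($ctx.result.statusCode == 200)
    $ctx.result.body
#else
    $utils.appendError($ctx.result.body, "$ctx.result.statusCode")
#end
```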

Create the new resolver:
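A self-contained sketch of this step, embedding the two mapping templates shown above as Python strings and attaching them to the `Query.posts` field:

```python
REQUEST_TEMPLATE = """{
    "version": "2018-05-29",
    "method": "GET",
    "resourcePath": "/posts",
    "params": { "headers": { "Content-Type": "application/json" } }
}"""

RESPONSE_TEMPLATE = """#if($ctx.result.statusCode == 200)
    $ctx.result.body
#else
    $utils.appendError($ctx.result.body, "$ctx.result.statusCode")
#end"""

def attach_resolver(client, api_id, data_source_name):
    # Wire the Query.posts field to the HTTP data source
    # using the request/response mapping templates above.
    response = client.create_resolver(
        apiId=api_id,
        typeName="Query",
        fieldName="posts",
        dataSourceName=data_source_name,
        requestMappingTemplate=REQUEST_TEMPLATE,
        responseMappingTemplate=RESPONSE_TEMPLATE,
    )
    return response["resolver"]["fieldName"]

# Usage:
# attach_resolver(client, api_id, "JsonPlaceholder")
```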

And we are done! We created an AppSync API that exposes an existing REST API through a GraphQL interface.


cURL request:

Response:
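The original request and response are not shown; querying the API would look roughly like this, where the endpoint URL and API key are placeholders for the values created above:

```
curl -X POST \
  -H "Content-Type: application/json" \
  -H "x-api-key: <YOUR_API_KEY>" \
  -d '{"query": "query { posts { id title } }"}' \
  https://<your-api-id>.appsync-api.<region>.amazonaws.com/graphql
```

with a response of the shape:

```
{
  "data": {
    "posts": [
      { "id": 1, "title": "..." }
    ]
  }
}
```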

Combine Amazon Translate with Elasticsearch and Skedler to build a cost-efficient multi-lingual omnichannel customer care – Part 1 and 2 – Skedler Blog

I’ve just published a new blog post on the Skedler Blog.

In this two-part blog post, we are going to present a system architecture to translate customer inquiries in different languages with Amazon Translate, index this information in Elasticsearch 6.2.3 for fast search, visualize the data with Kibana 6.2.3, and automate reporting and alerting using Skedler.

The components that we are going to use are the following:

  • AWS API Gateway
  • AWS Lambda
  • Amazon Translate
  • Elasticsearch 6.2.3
  • Kibana 6.2.3
  • Skedler Reports and Alerts

System architecture:

You can read the full post – Part 1 – here: Combine Amazon Translate with Elasticsearch and Skedler to build a cost-efficient multi-lingual omnichannel customer care – Part 1.

Part 2 – here: Combine Amazon Translate with Elasticsearch and Skedler to build a cost-efficient multi-lingual omnichannel customer care – Part 2 of 2.

Please share the post and let me know your feedback.

ASP.NET Core + Azure Text Analysis + AWS Text Analysis – Twitch Live Stream

I will be live on Twitch at 20:00 on Wednesday, the 28th of March, with Emanuele Bartolesi. We will be talking about ASP.NET Core and Text Analysis on AWS and Azure.

Here is the link to the Twitch stream: Twitch Stream.
After the live stream we will upload the video to YouTube.

Hope to see you there!

Extract business insights from audio using AWS Transcribe, AWS Comprehend and Elasticsearch – Part 1 and 2 – Skedler Blog

I’ve just published a new blog post on the Skedler Blog.
In this two-part blog post, we are going to present a system architecture to convert audio and voice into written text with AWS Transcribe, extract useful information for quick understanding of the content with AWS Comprehend, index this information in Elasticsearch 6.2 for fast search, and visualize the data with Kibana 6.2. In Part 1, you can learn about the key components, architecture, and common use cases. In Part 2, you can learn how to implement this architecture.

The components that we are going to use are the following:

  • AWS S3 bucket
  • AWS Transcribe
  • AWS Comprehend
  • Elasticsearch 6.2
  • Kibana 6.2
  • Skedler Reports and Alerts

System architecture:

You can read the full post – Part 1 – here: Extract business insights from audio using AWS Transcribe, AWS Comprehend and Elasticsearch – Part 1.

Part 2 – here: Extract business insights from audio using AWS Transcribe, AWS Comprehend and Elasticsearch – Part 2.

Please share the post and let me know your feedback.

PyCon Nove – Speaker

PyCon Nove is the ninth edition of the Italian Python Conference.
The event will take place in Florence, 19th–22nd April 2018.

During the event I will be speaking about “Monitoring Python Flask Application Performances with Elasticsearch and Kibana”.

Here you can find the complete schedule: PyCon Nove Schedule, and the abstract of my talk.

You can reserve your spot here: PyCon Registration.

PS: after the conference on Friday evening don’t miss the Elastic Meetup!

Hope to see you there 🙂