Amazon Web Services (AWS) accelerates a business’s ability to establish and maintain its internet presence by managing the hardware infrastructure. This removes the need for companies to manage procurement, maintenance, monitoring, and replacement or upgrade of hardware. Instead, system administrators are tasked with monitoring Elastic Compute Cloud (EC2) instances to guarantee availability, scaling, routing optimization, load balancing, software upgrades, and security patches. MarkLogic Data Hub Service makes systems administration in the cloud even easier.

MarkLogic Data Hub Service is a fully-automated cloud service to integrate data from silos. Delivered as a cloud service, it provides on-demand capacity, auto-scaling, automated database operations, and proven enterprise data security. As a result, agile teams can immediately start delivering business value by integrating and curating data for both operational and analytical use.

This tutorial gets users new to AWS up and running quickly by focusing on the specific components you need to get started. Further reading is recommended to fully understand all the technologies involved.


Data Hub Service Architectural Overview

The MarkLogic Data Hub Service on AWS sets up a Virtual Private Cloud (VPC) to allow MarkLogic to manage server and network resources.

Figure 1: Overview of server and network resources managed by MarkLogic. A more detailed architectural diagram is available in the MarkLogic Data Hub Service Admin Online Help.

A VPC is a virtual network of hosts that is isolated from all other virtual networks, providing more control over the hosts inside. Using a VPC reduces the risk of unauthorized communication to and from outside the network, and it improves communication between hosts, since traffic between neighboring hosts does not need to traverse the wider internet.

The MarkLogic Data Hub Service VPC employs auto-scaling configurations to automatically increase and decrease the number of resources as usage spikes and drops. This handles DevOps-related issues for the Data Hub Service user. A load balancer sits in front to coordinate incoming transactions, ensuring smooth communication with the ever-changing number of MarkLogic servers.

The MarkLogic Data Hub Service VPC can be configured to be publicly accessible, or configured as private, which requires peering to establish a connection with another VPC. For our purposes, we will focus on a publicly accessible instance of the MarkLogic Data Hub Service. If you are interested in learning more about a private Data Hub Service instance, visit Setting Up MarkLogic Data Hub Service with a Private VPC.

Set Up Accounts

Amazon Web Services

You will need to create an AWS account before creating a MarkLogic Cloud Service account. If you have already signed up for the accounts below, you may skip this section.

Creating an AWS account displays the screen below.


Figure 2: Create an AWS account

Proceed with registering the required details, making sure to complete the registration, including your payment method and verification of your contact number.

Sign up to MarkLogic as a Service

If you have already subscribed to MarkLogic Cloud Service, then you may skip this section of the guide.

Go to the AWS Marketplace and search for “MarkLogic cloud”. Look for the “MarkLogic Cloud Service” entry as shown below, then click “Subscribe” on the page that loads.


MarkLogic Cloud Service Account

After subscribing, you should be redirected to the MarkLogic Cloud Service homepage. Note that this account is separate from your AWS account, so click on “Create a new Account” to proceed.

Create Network Configuration

A public cluster is easy to set up and is recommended for anyone getting familiar with the MarkLogic Data Hub Service.

  1. Go to MarkLogic Cloud Services (https://cloudservices.marklogic.com) and click on Network in the top navigation.
  2. Click on the “Add Network” button.
  3. Supply the “Name” and preferred “Region.” Do NOT check the VPC peering option.
  4. Click on the “Configure” button.
  5. Wait for the provisioning to complete, clicking the refresh icon every so often to check progress.

You should end up with a NETWORK CREATED status, like the following:


Figure 3: Network created confirmation screen

Create the MarkLogic Data Hub Service Instance

On the MarkLogic Cloud Services homepage, click on the “+ Data Hub Service” tab and supply the following information:


Figure 4: “Create Data Hub Service” interface, with Low Priority service type and Public access selected

A “Service Type” of “Low Priority” incurs the lowest charges but also provides the fewest resources. It is recommended for exploration and proofs of concept. This service type still has all the other features of DHS except auto-scaling of resources and high availability.

“Standard,” on the other hand, has auto-scaling in effect. Note that the cost adjusts depending on the capacity you specify: the higher the capacity value, the higher the hourly cost. For the purposes of this guide, we will use the “Low Priority” service type.

“Private” access is only applicable to networks that have configured “Peering” information. We will discuss details of the “Private Network” in part 2 of this series.

Clicking on “Create” will spawn the MarkLogic VPC described in the Data Hub Service Architectural Overview. This can take a while, around ten minutes or so. Click the refresh icon in the upper left to check for updates until you see something like the following:

Data Hub Service Confirmation

Clicking on the service name link will take you to the Data Hub Service details page:


Figure 5: Data Hub Service details

Manage MarkLogic Data Hub Service Access

The links under “Endpoints” are disabled until users are created. To start using your Data Hub Service instance, you need to create users. Note that the service admin does not have access to all actions by default. Click on the “Internal” button to expose the “Manage Users” button, then click on “Manage Users” to add users with specific roles.


Figure 6: Users and roles for the Data Hub Service

Note that the users created in Figure 6 above will not have SSH access to your servers. These users are MarkLogic accounts created to connect to the endpoints. These roles are described as follows:

Flow Developer: Can load modules into the MarkLogic modules database, load TDE templates, and update indexes. Essentially your Gradle task executor.

Flow Operator: Can ingest data and run your flows.

Endpoint Developer: A subset of “Flow Developer”. Can load modules, but cannot overwrite existing modules that they did not upload. Cannot upload TDE templates or update indexes. Intended as a “Data Service” developer role.

Endpoint User: For users who consume the “Data Services” developed by the “Endpoint Developer”.

ODBC User: Meant to be used with port 5432. Typically the credentials for your business intelligence tool, or perhaps your Koop instance.

Service Security Admin: For users who configure external security via LDAP.
Figure 7: Data Hub Service User Roles

The following table details the available endpoints provided by the MarkLogic Data Hub Service:

Manage (port 8002; content database: App-Services; server: Curation)
Port to be used when loading modules, updating indexes, and uploading your TDE templates. This is your standard Manage server, as available in a local install of MarkLogic. Links to the Monitoring Dashboard and Monitoring History are available here, along with Configuration Manager and QConsole.

REST (port 8004; content database: data-hub-MODULES; server: Curation)
Port to be used to view your loaded modules. Supports MarkLogic’s built-in REST API.

Ingest (port 8005; content database: data-hub-STAGING; server: Curation)
XDBC app server to be used by MLCP.

Curation Staging REST (port 8010; content database: data-hub-STAGING; server: Curation)
Port to be used when running your ingest and harmonization flows. Supports MarkLogic’s built-in REST API, which lets you confirm what was loaded into your STAGING database.

Curation Final REST (port 8011; content database: data-hub-FINAL; server: Curation)
Port to be used when running ingest targeted at the FINAL database and harmonization flows meant to pick up documents from the FINAL database. Supports MarkLogic’s built-in REST API, which lets you confirm the documents available in your FINAL database.

Jobs (port 8013; content database: data-hub-JOBS; server: Curation)
This port allows the user to view jobs and traces.

Query Console (port 8002; content database: App-Services; server: Curation)
Available for Low Priority only.

Analytics REST (port 8011; content database: data-hub-FINAL; server: Analytics)
Supports MarkLogic’s built-in REST API, which lets you confirm what was loaded into your FINAL database.

Operations REST (port 8011; content database: App-Services; server: Operations)
Supports MarkLogic’s built-in REST API, which lets you confirm what was loaded into your FINAL database. The goal is to separate operations-related transactions from report-related functions.

ODBC (port 5432; content database: data-hub-FINAL; server: Curation)
Port to be used by your BI tools.

Figure 8: Available Data Hub Service endpoints (please note that “Query Console” is only applicable to a “Low Priority” type of Data Hub Service instance)
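
To make the endpoint table concrete, the sketch below shows how a user with the “Flow Operator” role might exercise two of these endpoints from the command line: an MLCP ingest through the Ingest app server on port 8005, and a quick search through the Curation Staging REST app server on port 8010 to confirm what landed in STAGING. The host name, credentials, input path, and collection name are placeholders, and the exact TLS and load-balancer options your instance requires may differ.

  # Ingest local files into data-hub-STAGING via the Ingest (XDBC) endpoint on port 8005.
  # <CURATION-HOST>, the user, the password, and the input path are placeholders.
  # DHS sits behind a load balancer over TLS, so -ssl and -restrict_hosts are typically
  # needed; verify the options your instance requires.
  mlcp.sh import \
    -host <CURATION-HOST> \
    -port 8005 \
    -username flow-operator-user \
    -password '********' \
    -ssl true \
    -restrict_hosts true \
    -input_file_path ./data/customers \
    -input_file_type documents \
    -output_collections raw-customers

  # Confirm the ingested documents via the built-in REST API on the
  # Curation Staging REST endpoint (port 8010).
  curl --anyauth --user flow-operator-user:'********' \
    "https://<CURATION-HOST>:8010/v1/search?collection=raw-customers&format=json"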

Developer Access to the Data Hub Service

The following sections will provide some guidelines on what needs to be done and what can be done to load your modules to the Data Hub Service instance.

Note that you cannot currently use the Data Hub Developer Quickstart to directly develop on top of the MarkLogic Data Hub Service instance.

Confirm Initial Configuration

To confirm the availability of the initial configuration, load the Configuration Manager application at the “Manage” endpoint; you will have to adjust the path to include /manage. When prompted for credentials, use the account configured with the “Flow Developer” role.
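
If you prefer the command line, the same check can be sketched with curl against the Management API root on the “Manage” endpoint; the host and credentials below are placeholders:

  # A successful JSON response confirms the endpoint is reachable and the
  # "Flow Developer" credentials are accepted.
  curl --anyauth --user flow-developer-user:'********' \
    "https://<MANAGE-HOST>:8002/manage/v2?format=json"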

Project configuration

The Gradle configuration for your Data Hub project can be downloaded by following these steps:

  1. Click on the “Action” button
  2. Hover over “Gradle Config”
  3. Click on “Download” and save to your project folder

Gradle tasks

The table below lists the available Gradle tasks. These tasks are for a Data Hub Framework project, not a vanilla ml-gradle project.

hubInstall: Installs the DHF modules.
mlLoadModules: Deploys your custom code beyond the default DHF code.
mlUpdateIndexes: Deploys your indexes as defined in <project-root>/src/hub-internal-config/databases/your-db.json.
mlDeployViewSchemas: Deploys your TDE templates as defined in <project-root>/src/hub-internal-config/schemas/tde/your-template.json.
hubRunFlow: Runs your harmonization flow.
hubDeployAsSecurityAdmin: Deploys custom roles and assigns them to users.

Figure 9: Gradle tasks and purposes

The mlDeploy task is unavailable for MarkLogic Data Hub Service users, since users do not have the full admin role.
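
As an illustration rather than an official recipe, a typical development cycle against a Data Hub Service instance might chain these tasks as shown below. The flow name is a placeholder, the parameters accepted by hubRunFlow can vary by Data Hub Framework version, and credentials are assumed to come from the downloaded Gradle configuration in the project root.

  # Deploy custom modules, index settings, and TDE templates as the "Flow Developer" user.
  ./gradlew mlLoadModules mlUpdateIndexes mlDeployViewSchemas

  # Run a harmonization flow as the "Flow Operator" user.
  # "HarmonizeCustomers" is a hypothetical flow name.
  ./gradlew hubRunFlow -PflowName=HarmonizeCustomers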

Access for Data Services

“Analytics REST” (<ANALYTICS>:8011) and “Operations REST” (<OPERATIONAL>:8011) support both the built-in MarkLogic REST API and Data Services First (DSF) endpoints. DSF modules must reside in a ‘ds’ folder, as noted in the documentation, and the corresponding endpoints must be accessed with a ‘/ds’ path prefix.
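
As a sketch, a Data Service deployed under the ‘ds’ folder, say a hypothetical getCustomer endpoint module, might be called on the “Analytics REST” app server as shown below. The module path, parameter, and credentials are placeholders, and the exact request path and method depend on how your service is declared; generated proxy classes normally handle this for you.

  # Hypothetical DSF call; adjust the /ds path to match your deployed endpoint module.
  curl --anyauth --user endpoint-user:'********' \
    -X POST -d "customerId=12345" \
    "https://<ANALYTICS-HOST>:8011/ds/example/getCustomer.sjs"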

If you run into issues using MarkLogic Data Hub Service, contact Support. MarkLogic engineers and enthusiasts are also active on Stack Overflow; just tag your questions with ‘marklogic’.

Ready for more? If you are interested in learning more about a private Data Hub Service instance, visit Setting Up MarkLogic Data Hub Service with a Private VPC.

Learn More

Data Hub Service with a Private VPC

Learn how to configure a private MarkLogic Data Hub Service VPC and required peering.

Documentation

Dig deeper into what Data Hub Service is, what is needed to get running, and how to use Data Hub Service in AWS.

Data Hub Service

Explore all technical resources available for MarkLogic Data Hub Service.
