As industrial and manufacturing companies embark on their digital transformation journey to leverage advanced technologies for increased efficiency, productivity, quality control, flexibility, cost reduction, supply chain optimization, and competitive advantage in the rapidly evolving digital era, AWS customers in the manufacturing and industrial space increasingly turn to AWS IoT SiteWise to modernize their industrial data strategy and unlock the full potential of their operational technology. AWS IoT SiteWise empowers you to efficiently collect, store, organize, and monitor data from industrial equipment at scale. It also enables you to derive actionable insights, optimize operations, and drive innovation through data-driven decisions.
The journey often begins with a Proof of Value (PoV) case study in a development environment. This approach gives you an opportunity to explore how data collection and asset modeling with a solution that includes AWS IoT SiteWise can help. As you become comfortable with the solution, you can scale more assets or facilities from staging into a production environment over time. This blog post provides an overview of the architecture and sample code to migrate the assets and data in AWS IoT SiteWise from one deployment to another, while ensuring data integrity and minimizing operational overhead.
During the PoV phase, you establish data ingestion pipelines to stream near real-time sensor data from on-premises data historians, or OPC UA servers, into AWS IoT SiteWise. You can create asset models that digitally represent your industrial equipment to capture the asset hierarchy and critical metadata within a single facility or across multiple facilities. AWS IoT SiteWise provides API operations that help you import your asset model data (metadata) in bulk from various systems, such as process historians, into AWS IoT SiteWise at scale. Additionally, you can define common industrial key performance indicators (KPIs) using the built-in library of operators and functions available in AWS IoT SiteWise. You can also create custom metrics that are triggered by equipment data on arrival or computed at user-defined intervals.
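As a minimal sketch of what an asset model with a built-in metric might look like, the following builds a CreateAssetModel request payload with one measurement and one tumbling-window average metric. The model name, property names, unit, and window interval are hypothetical placeholders, not values from this post; adjust them to your own equipment before calling the API.

```python
# Hypothetical model and property names for illustration only.
# The payload mirrors the shape of the AWS IoT SiteWise CreateAssetModel request.
asset_model_payload = {
    "assetModelName": "DemoTurbine",
    "assetModelProperties": [
        {
            # Raw measurement streamed from equipment
            "name": "Wind Speed",
            "dataType": "DOUBLE",
            "unit": "m/s",
            "type": {"measurement": {}},
        },
        {
            # Metric computed on a user-defined (tumbling window) interval
            "name": "Average Wind Speed",
            "dataType": "DOUBLE",
            "unit": "m/s",
            "type": {
                "metric": {
                    "expression": "avg(wind_speed)",
                    "variables": [
                        {
                            "name": "wind_speed",
                            # When creating the model in one request, the
                            # variable can reference the property by name.
                            "value": {"propertyId": "Wind Speed"},
                        }
                    ],
                    "window": {"tumbling": {"interval": "5m"}},
                }
            },
        },
    ],
}

# Uncomment to create the model in your development account:
# import boto3
# sitewise = boto3.client("iotsitewise")
# response = sitewise.create_asset_model(**asset_model_payload)
```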
Setting up multiple non-production environments on a factory floor can be challenging due to legacy networking and strict regulations associated with the plant floor, in addition to delays in hardware procurement. Many customers transition the same hardware from non-production to production by designating and certifying the hardware for production use after validation completes.
To accelerate and streamline the deployment process, you need a well-defined approach to migrate your AWS IoT SiteWise resources (assets, hierarchies, metrics, transforms, time series, and metadata) between AWS accounts as part of your standard DevOps practices.
AWS IoT SiteWise stores data across storage tiers that can support training machine learning (ML) models or historical data analysis in production. In this blog post, we provide an outline of how to migrate the asset models, asset hierarchies, and historical time series data from the development environment to the staging and production environments hosted on AWS.
Let's begin by discussing the technical aspects of migrating AWS IoT SiteWise resources and data between AWS accounts. We provide a step-by-step guide on how to export and import asset models and hierarchies using AWS IoT SiteWise APIs. We also discuss how to transfer historical time series data using Amazon Simple Storage Service (Amazon S3) and the AWS IoT SiteWise BatchPutAssetPropertyValue API operation.
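For the BatchPutAssetPropertyValue path, a hedged sketch of building one request entry is shown below. The asset ID, property ID, and sample values are hypothetical; the helper only assembles the request shape (at most 10 values per entry and 10 entries per call, per the API limits), and the actual call is left commented out.

```python
import time
import uuid

def build_put_entry(asset_id, property_id, rows):
    """Build one BatchPutAssetPropertyValue entry from (timestamp_seconds, value)
    pairs. The API accepts at most 10 values per entry and 10 entries per call."""
    return {
        'entryId': str(uuid.uuid4()),
        'assetId': asset_id,
        'propertyId': property_id,
        'propertyValues': [
            {
                'value': {'doubleValue': v},
                'timestamp': {'timeInSeconds': ts, 'offsetInNanos': 0},
                'quality': 'GOOD',
            }
            for ts, v in rows
        ],
    }

# Hypothetical IDs; replace with assets in your account.
entry = build_put_entry('a1', 'b1', [(int(time.time()), 42.0)])

# Uncomment to write the values into AWS IoT SiteWise:
# import boto3
# sitewise = boto3.client('iotsitewise')
# sitewise.batch_put_asset_property_value(entries=[entry])
```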
By following this approach, you can promote your AWS IoT SiteWise setup and data through the development lifecycle as you scale your industrial IoT applications into production. The following is an overview of the process:
1. Export AWS IoT SiteWise models and assets from the development account by running a bulk export job. You can use filters to export the models and/or assets.
2. Import AWS IoT SiteWise models and assets into the staging account by running a bulk import job. The import files must follow the AWS IoT SiteWise metadata transfer job schema.
3. Export AWS IoT SiteWise historical data from the development account.
4. Import AWS IoT SiteWise historical data into the staging account.
The data migration steps in our solution make the following assumptions:
- The staging account does not have AWS IoT SiteWise assets or models configured that use the same name or hierarchy as the development account.
- You will migrate the metadata (asset models and hierarchies) from the development account to the staging account.
- You will migrate the historical data from the development account to the staging account.
Figure 1: Architecture to migrate AWS IoT SiteWise metadata across AWS accounts
AWS IoT SiteWise supports bulk operations with assets and models. The metadata bulk operations help you export metadata from the development account by running a bulk export job; you can choose what to export when you configure this job. For more information, see Export metadata examples.

AWS IoT SiteWise also supports ingesting high-volume historical data using the CreateBulkImportJob API operation to migrate telemetry data from the development account to the staging account.
Figure 2: Architecture to migrate AWS IoT SiteWise telemetry data across AWS accounts
AWS IoT SiteWise has data and SQL API operations to retrieve telemetry results. You can use the export file from the bulk export job step to get a list of AWS IoT SiteWise asset IDs and property IDs to query using the BatchGetAssetPropertyValueHistory API operation. The following sample code demonstrates retrieving data for the last two days:
import boto3
import csv
import time
import uuid

"""
Connect to the AWS IoT SiteWise API and define the assets and properties
to retrieve data for.
"""
sitewise = boto3.client('iotsitewise')

# Limit of 10 assetIds/propertyIds/entryIds per API call
asset_ids = ['a1', 'a2', 'a3']
property_ids = ['b1', 'b2', 'b3']

"""
Get the start and end timestamps for the date range of historical data
to retrieve. Currently set to the last 2 days.
"""
# Convert current time to a Unix timestamp (seconds since epoch)
end_time = int(time.time())
# Start date 2 days ago
start_time = end_time - 2*24*60*60

"""
Generate a list of entries to retrieve property value history.
Loops through the asset_ids and property_ids lists, zipping them
together to generate a unique entry for each asset-property pair.
Each entry contains a UUID for the entryId, the corresponding
assetId and propertyId, and the start and end timestamps for
the date range of historical data.
"""
entries = []
for asset_id, property_id in zip(asset_ids, property_ids):
    entry = {
        'entryId': str(uuid.uuid4()),
        'assetId': asset_id,
        'propertyId': property_id,
        'startDate': start_time,
        'endDate': end_time,
        'qualities': ["GOOD"],
    }
    entries.append(entry)

"""
Generate an entries dictionary to map entry IDs to the full entry data
for retrieving property values by entry ID.
"""
entries_dict = {entry['entryId']: entry for entry in entries}

"""
The snippet below retrieves asset property value history from AWS IoT SiteWise using the
`batch_get_asset_property_value_history` API call. The retrieved data is then
processed and written to a CSV file named 'values.csv'.
The script handles pagination by using the `nextToken` parameter to fetch
subsequent pages of data. Once all data has been retrieved, the script
exits the loop and closes the CSV file.
"""
token = None
with open('values.csv', 'w') as f:
    writer = csv.writer(f)
    while True:
        # Make the API call, passing the token on subsequent calls.
        if not token:
            property_history = sitewise.batch_get_asset_property_value_history(
                entries=entries
            )
        else:
            property_history = sitewise.batch_get_asset_property_value_history(
                entries=entries,
                nextToken=token
            )
        # Process success entries, extracting values into a list of dicts.
        for entry in property_history['successEntries']:
            entry_id = entry['entryId']
            asset_id = entries_dict[entry_id]['assetId']
            property_id = entries_dict[entry_id]['propertyId']
            for history_values in entry['assetPropertyValueHistory']:
                value_dict = history_values.get('value')
                values_dict = {
                    'ASSET_ID': asset_id,
                    'PROPERTY_ID': property_id,
                    'DATA_TYPE': str(list(value_dict.keys())[0]).upper().replace("VALUE", ""),
                    'TIMESTAMP_SECONDS': history_values['timestamp']['timeInSeconds'],
                    'TIMESTAMP_NANO_OFFSET': history_values['timestamp']['offsetInNanos'],
                    'QUALITY': 'GOOD',
                    'VALUE': value_dict[list(value_dict.keys())[0]],
                }
                writer.writerow(list(values_dict.values()))
        # Check for a next token and break when pagination is complete.
        if 'nextToken' in property_history:
            token = property_history['nextToken']
        else:
            break
Use the values.csv file to import data into AWS IoT SiteWise using the CreateBulkImportJob API operation. Define the following parameters when you create an import job using CreateBulkImportJob. For a code sample, see CreateBulkImportJob in the AWS documentation.

- Set the adaptive-ingestion-flag to true or false. For this exercise, set the value to true. When set to true, the bulk import job ingests new data and computes metrics and transforms for it; when set to false, the bulk import job ingests historical data into AWS IoT SiteWise.
- Set the delete-files-after-import-flag to true to delete the data from the Amazon S3 data bucket after ingesting it into AWS IoT SiteWise warm tier storage. For more information, see Create a bulk import job (AWS CLI).

After you validate the results in the staging account, you can delete the data from the development account using the AWS IoT SiteWise DeleteAsset and DeleteAssetModel API operations. Alternatively, you can continue to use the development account for other development and testing activities with the historical data.
In this blog post, we addressed the challenge industrial customers face when scaling their AWS IoT SiteWise deployments. We discussed moving from PoV to production across multiple plants and production lines and how AWS IoT SiteWise addresses these challenges. Migrating metadata (such as asset models and asset/enterprise hierarchies) and historical telemetry data between AWS accounts ensures consistent data context. It also helps you promote industrial IoT assets and data through the development lifecycle. For more details, see Bulk operations with assets and models.