Qlik Data Catalyst February 2020 update


Hi, I’m Chris Ortega, and today I’m going to show you the integration of Qlik Data Catalyst with Qlik Cloud Services. There are two ways of moving data from on-premises to Qlik Cloud Services. The first, which we’ll go through now, is to move a QVD to your personal data files by shopping the catalog and doing a one-click transfer of that QVD up to QCS. The second works for any data file in the catalog, whether it’s a QVD or another entity: we use the publish module to move that information up to Qlik Cloud Services in QVD format, regardless of its source type.

I have a running instance of Qlik Data Catalyst in this tab, and in this other tab I’ve connected to my tenant on Qlik Cloud Services. What I’m going to do first is move data from Data Catalyst to Qlik Cloud Services by taking a QVD and adding it to my data files on QCS. The first thing I’ll do is search for my QVDs, which gives me a list of the available QVDs in the catalog. To show you the Cloud Services side, I’ll go to my app in Sense and open the Add data tab, and you’ll notice I have my data files here.
I’ve also established an S3 connection to a bucket, but we’re going to talk about the personal data files first, and you’ll see there is nothing here yet. So what I’ll do is select one of these QVDs, in this case bed occupancy, add it to my cart, and take action. You’ll notice at the bottom of this drop-down I now have the option to publish to my QCS data files. That takes my QVD and pushes it up to Qlik Cloud Services, and if I go back to the Add data tab, I now have the bed occupancy QVD there and can begin to work with it.

The way this works is that within Data Catalyst, in my user profile, I’ve mapped my user to my account on Qlik Cloud Services: I generated a QCS user token on Qlik Cloud Services and copied and pasted that token into Data Catalyst. Under the covers we’re leveraging Qlik’s data file service to move the data.
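To make the mechanics concrete, the sketch below shows one way a QVD could be pushed into a tenant’s data files over REST with an API token. This is a minimal illustration, not Data Catalyst’s internal implementation: the tenant URL, file name, and the use of the public /api/v1/data-files endpoint with its "File" and "Json" multipart parts are assumptions for the example.

```python
import requests

# Assumptions (hypothetical values): tenant URL, API key generated from the
# QCS user profile, and a local QVD exported by Qlik Data Catalyst.
TENANT = "https://your-tenant.us.qlikcloud.com"
API_KEY = "<QCS user token pasted into Data Catalyst>"
QVD_PATH = "bed_occupancy.qvd"

def upload_qvd_to_data_files(path: str) -> dict:
    """Push a QVD into the tenant's personal data files over REST.

    The /api/v1/data-files endpoint and its multipart parts ("File", "Json")
    follow the public Qlik Cloud data-files API; the service Data Catalyst
    calls internally may differ.
    """
    with open(path, "rb") as f:
        resp = requests.post(
            f"{TENANT}/api/v1/data-files",
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={
                "File": (path, f, "application/octet-stream"),
                "Json": (None, '{"name": "bed_occupancy.qvd"}', "application/json"),
            },
            timeout=60,
        )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(upload_qvd_to_data_files(QVD_PATH))
```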
Next, we’re going to move either a QVD or, more importantly, a non-QVD dataset to QCS using the publish module. Besides reconciling the native data format, QVD or non-QVD, the publish module also lets us schedule these jobs.

The first thing I’ll show you in QDC is a target we’ve set up called QCS training bucket, which references Qlik Cloud Services; the tenant and all of that is configured in the properties file. We’ve granted access to certain S3 buckets and assigned them to groups, so if you don’t have access to a bucket based on your QDC group (and these should align with QCS as well), you won’t be able to write to it. We have a bucket here, QDC training, that we’re going to write to. If I close this and switch to the QCS side, you’ll see I’ve added an S3 connection to that bucket. On the Add data page you can see its contents; QDC-generated objects will show up here with a QDC prefix, and right now there are no QDC objects there.
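To make the bucket-permission rule above concrete, here is a purely hypothetical sketch of a group-to-bucket mapping and the check it implies; the group names, bucket names, and structure are illustrative only and do not reflect QDC’s actual configuration format.

```python
# Hypothetical illustration of group-based write access to publish targets.
# Neither the group names nor this structure come from QDC's configuration;
# they only mirror the rule described in the demo: you can write to a bucket
# only if one of your QDC groups has been granted that bucket.
BUCKET_GRANTS = {
    "qdc-training": {"qdc_trainers", "qdc_admins"},
    "qdc-finance":  {"qdc_finance"},
}

def can_publish(user_groups: set, bucket: str) -> bool:
    """Return True if any of the user's groups is granted the target bucket."""
    return bool(BUCKET_GRANTS.get(bucket, set()) & user_groups)

print(can_publish({"qdc_trainers"}, "qdc-training"))  # True
print(can_publish({"qdc_trainers"}, "qdc-finance"))   # False
```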
Now I’ll go back to my catalog, where I’ve searched for a particular source, in this case TPCH. I have a customer file with 1,500 rows and all of this great catalog profile information. I’ll add it to my cart, and this time when I take action, rather than publishing to my QCS data files, I’m going to use the publish framework. I set up a dataset name, and for the target I choose that QCS training bucket. When I click Next, I’m asked which data I want to send over: all loads of the data, the latest load, or specific ones. I’m going to send the latest load. Based on my permissions I have access to only one of the two buckets, the QDC training bucket, so I choose that. I click Next and schedule the job; I can set this up as a recurring schedule or a one-time immediate run, and in this case I’ll do a one-time immediate run. When I click Next I’m brought to a screen showing some fields that are marked as sensitive, so those will be obfuscated when written out.
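The choices made in the wizard so far can be summarized as a small job specification. The dictionary below is purely illustrative, not QDC’s actual publish API; the field names are hypothetical and only restate the selections from the demo.

```python
# Hypothetical summary of the publish job configured in the wizard.
# Field names are illustrative, not QDC's real publish API.
publish_job = {
    "dataset_name": "customer",
    "source": "TPCH",
    "target": "QCS training bucket",       # the S3-backed publish target
    "loads": "latest",                      # vs. "all" or a list of specific loads
    "schedule": {"type": "one_time", "run": "immediate"},  # or a recurring schedule
    "obfuscate_sensitive_fields": True,     # fields marked sensitive are masked on write
    "output_format": "qvd",
}
```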
When I click Next again and save, it creates and runs the job. If I look at the log details I can see that the job is running, and if I poll it for updates we can see what it’s actually doing. Right now it has taken that data and generated a QVD file for it, and it has also generated sample data and the profile data for transfer to QCS. Now that the job has finished, we can toggle over to QCS.
If I click into the S3 connection, what I’ll see is that I now have something in this S3 bucket prefixed by QDC; the source is TPCHSF ‘001’ and the entity is customer. Underneath that I have my QVD, which is the customer data, along with the rich profiling data and the sample data. In this way I’m able to work with all of that information.
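Since the published artifacts land in an ordinary S3 bucket, they can also be listed with any S3 client. The sketch below assumes a bucket named qdc-training and a qdc/tpch/customer/ key prefix mirroring the source/entity layout shown in the demo; the exact prefix structure QDC writes may differ.

```python
import boto3

# Assumed bucket and prefix; the demo shows QDC-generated objects under a
# "QDC" prefix, grouped by source (TPCH) and entity (customer).
BUCKET = "qdc-training"
PREFIX = "qdc/tpch/customer/"

s3 = boto3.client("s3")
resp = s3.list_objects_v2(Bucket=BUCKET, Prefix=PREFIX)
for obj in resp.get("Contents", []):
    # Expect three kinds of artifacts per entity: the QVD itself,
    # the profiling data, and the sample data.
    print(obj["Key"], obj["Size"])
```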
So between the two mechanisms, you can either move a QVD from the catalog to your own data files in a one-click fashion, or, if you have access to an S3 bucket, you can take any data and make it available in QCS as a QVD. Again, this was Chris Ortega, product manager for Qlik Data Catalyst. I hope you enjoyed this brief demo. Thank you!
