Kueri, a natural language database tool - Part Three

_________________________

Kueri installation - Windows 7 PC
_________________________

I installed Kueri on a Windows 7 PC. In the on-line Kueri reference, this page shows how to do it. Everything went well, and I'll share my experiences here. Kueri also works in a Linux environment; this page has further details.

First, be sure to install at least Java 1.7 before you start the Kueri installation; Kueri relies on Java for its own system management. Kueri works perfectly with a locally installed MySQL / SQL Server / etc. database, but it does not need these resources for its own operation. Because the install package downloads as a tar.gz archive file, the suggested 7-Zip (a free download) really helped. The Windows environment instructions worked well and I quickly installed Kueri SDK Enterprise. Note that Kueri will not show in the Windows Control Panel. If you need to uninstall Kueri for some reason, first drill down to the folder / directory location where you installed it and delete it. Next, drill down to

     C:\Users\{user name}

as in this example


1. Look for the .kueri folder / directory in C:\Users\{user name}
and delete the

     .kueri

folder.

Kueri will work perfectly with a MySQL database stored anywhere, locally or in the cloud. For MySQL, however, Kueri will need a MySQL Java connector, available here from the MySQL website. Copy the connector JAR file, named

     mysql-connector-java-5.1.39-bin

here, from the package, and place it in the

     kueri\tomcat\jdbc

directory as shown in this


2. Place the MySQL Java Connector in the
kueri\tomcat\jdbc
directory
example. 

To run Kueri, first launch the

     startup.bat

file as step 5 explains. To make things easier, I created a desktop shortcut to this batch file, and in its Advanced . . . (AKA properties) dialog

3. Shortcut Advanced properties

I picked

4. Run the batch file as an administrator
"Run as administrator". When startup.bat launches, a DOS command prompt (DOS box) will open; at first, a lot of activity will happen inside that box, but soon enough it will ease off and look roughly like

5. Running startup.bat
this. Keep this box open the entire time Kueri runs. Minimize it if you want, but don't close it.

We're almost there. Point a new browser tab to

     127.0.0.1/admin

and the Kueri admin login

6. Kueri admin login
will open. Use

     admin / admin


as the email / password for the first login; after that, Kueri will want a new password. After a little more configuration in step 8 here, you can use Kueri in production. Next, I'll set up a Kueri connection to the Google Cloud SQL resource I built in Part Two.
_______________

Part three here showed how to set up Kueri in a Windows environment. Part four will show how to configure a Kueri Data Source.

Kueri, a natural language database tool - Part Four

_________________________

Configure a Kueri Data Source
_________________________

After the Kueri login, I clicked Data Sources at the bottom right and on the Data Sources page, I clicked

1. Set up a New Connection to a data source
New Connection. In the MySQL Workbench

2. Find the Cloud SQL IP address
I copied the Cloud SQL IP address from the kueri_demo connection, and back at the Kueri Admin page, I drilled down to open the New Database connection

3. Build a new database connection
page. I named the new Database Connection "kueri_demo"; for the Host, I used the Cloud SQL resource IP address I copied as shown in Figure 2 above. Note that I could have found the IP address at the Google Cloud SQL console as well. I used the Google Cloud SQL credentials for the user name / password, I tested it, and I clicked Next to open this

4. Choose tables
page. Although this page shows a "test_demo_table", the create-database script in the GitHub repository will not create this table, and this article will not use it in any way. It shows on this page only because of my own testing and development work. Click Next to open this

5. Initial configuration - Data Source tables
page for initial configuration of the chosen tables. To get the most out of Kueri, table configuration matters the most. The folks at Kueri themselves publish great documentation here and especially here - more than enough to get going. In the GitHub repository, I included the latest schema, or database configuration, for the kueri_demo datasource. Of course, Kueri allows and encourages further datasource data table configuration after the initial work phase. Drill down

6. Configure existing tables
to

     Configure -> Edit

tables to open the table configuration page, and proceed.

A configured Kueri Data Source becomes a valuable resource, and resource owners need a way to back up and restore it. For Kueri, this means "Export / Import a Semantic Map". To do this, drill into the Data Source options

7. Export/Import Semantic Map
of the relevant data source (kueri_demo in this case) and click Export/Import Semantic Map. This dialog box

8. Export / Import dialog box
will open. In the GitHub repository, file

     KUERI_ARTICLE_RESOURCES.zip

has an exported backup file called

     kueri_demo_30.1

for the kueri_demo Data Source described in this article. To use it, build a bare-bones Kueri Data Source called "kueri_demo", and then import kueri_demo_30.1 with the feature described here.

I found that I could handle the table configuration more easily with the Workspace open in one browser tab and the Table Configuration page open in another tab.

Note that web developers can embed Kueri in the web software they build. Although I have not yet explored that feature, it clearly has even more potential.

As I worked with Kueri to configure the

     kueri_demo

Data Source, I realized that the more effort one puts into a Kueri configuration, the more value Kueri will return. The

     kueri_demo_30.1

file reflects my latest configuration work, and because this involved experimentation, a description of this work does not fit well in a review article. I admit: I am still learning how to configure - and more importantly optimize - Data Source tables in and through Kueri. I believe that like so many other products, Kueri will start as a skill set, then grow into a specialty, and finally become an industry. Right now, I am at the Kueri skill set building stage.
_______________

As we've seen, Kueri integrates natural language with relational database resources. It solves a difficult problem and it solves that problem well. Although it has a large configuration space, anyone with a basic understanding of that space can use it. More understanding and skill means more value returned. Kueri has unlimited potential.

Kueri, a natural language database tool - Part Two

_________________________

The Google Cloud SQL database for this article
_________________________

For this article, Kueri points to a Google Cloud SQL database called

     kueri_demo

as the featured demo data source. This database has two tables. Starting from this U.S. Census Bureau page, the Complete Zip Code Totals file (zbp11totals) became the source for the

     kueri_demo.zbp11totals

table. At this IRS page, the 13zpallnoagi file (the second file in the third ZIP Code Data block) became the source for the second

     kueri_demo.taxstat

table, which I will explain here.

For the 13zpallnoagi file import, I first scrapped all columns starting at column "AD" (column 30). Next, I deleted the

     STATEFIPS (column "A" <=> column 1)

and the

     AGI_STUB (column "D" <=> column 4)

columns. This left 27 columns. This file (the file in the second Documentation block) at the IRS page maps the 13zpallnoagi file layout, and I renamed most of the remaining columns


ORIGINAL COLUMN NAME    NEW COLUMN NAME
STATE                   STATE
ZIPCODE                 ZIP
N1                      OVERALL_RETURN_COUNT
MARS1                   SINGLE_RETURN_COUNT
MARS2                   JOINT_RETURN_COUNT
MARS4                   HOH_RETURN_COUNT
PREP                    PAID_PREP_RETURN_COUNT
N2                      EXMPTN_COUNT
NUMDEP                  DEPNDT_COUNT
A00100                  AGI
N02650                  RTN_COUNT_WITH_TOTAL_INCOME
A02650                  TOTAL_INC_AMT
N00200                  RTN_COUNT_WITH_SAL_AND_WAGES
A00200                  SAL_AND_WAGE_AMT
N00300                  RTN_COUNT_WITH_TXBL_INT
A00300                  TXBL_INT_AMT
N00600                  RTN_COUNT_WITH_ORDY_DVDS
A00600                  ORDY_DVD_AMT
N00650                  RTN_COUNT_WITH_QLFD_DVDS
A00650                  QLFD_DVD_AMT
N00700                  RTN_COUNT_WITH_ST_LCL_RFDS
A00700                  ST_LCL_RFD_AMT

to make things easier.
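For readers who script this kind of import cleanup themselves, the rename step can be sketched as a simple lookup table. This is only an illustration of the mapping above; renameColumns() is a hypothetical helper, not part of any tool the article uses.

```javascript
// The ORIGINAL -> NEW column name mapping from the table above.
const columnRenames = {
  STATE: 'STATE',
  ZIPCODE: 'ZIP',
  N1: 'OVERALL_RETURN_COUNT',
  MARS1: 'SINGLE_RETURN_COUNT',
  MARS2: 'JOINT_RETURN_COUNT',
  MARS4: 'HOH_RETURN_COUNT',
  PREP: 'PAID_PREP_RETURN_COUNT',
  N2: 'EXMPTN_COUNT',
  NUMDEP: 'DEPNDT_COUNT',
  A00100: 'AGI',
  N02650: 'RTN_COUNT_WITH_TOTAL_INCOME',
  A02650: 'TOTAL_INC_AMT',
  N00200: 'RTN_COUNT_WITH_SAL_AND_WAGES',
  A00200: 'SAL_AND_WAGE_AMT',
  N00300: 'RTN_COUNT_WITH_TXBL_INT',
  A00300: 'TXBL_INT_AMT',
  N00600: 'RTN_COUNT_WITH_ORDY_DVDS',
  A00600: 'ORDY_DVD_AMT',
  N00650: 'RTN_COUNT_WITH_QLFD_DVDS',
  A00650: 'QLFD_DVD_AMT',
  N00700: 'RTN_COUNT_WITH_ST_LCL_RFDS',
  A00700: 'ST_LCL_RFD_AMT',
};

// Rename a CSV header row; any column not in the map keeps its original name.
function renameColumns(headerRow) {
  return headerRow.map((name) => columnRenames[name] || name);
}
```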

Next, I set up a Google Cloud SQL (AKA MySQL hosted by Google) resource. Google Cloud Platform offers a free trial - $300.00 of resources for sixty days - and I used it here. To manage the Cloud SQL resource, I used a locally installed MySQL Workbench 6.3, and with steps I described in an earlier Bit Vectors article, I built a connection between MySQL Workbench and Cloud SQL. At that point, I began to build the kueri_demo database. After I built this database, I rolled it up into a complete database creation script, which I placed in a 3.3-meg ZIP file named

     KUERI_ARTICLE_RESOURCES

in the GitHub repository. This file will unzip into a 61-meg MySQL script file; pick the MySQL Workbench connection to the Cloud SQL resource, drop the script file text into a MySQL Workbench tab, and run it. With one click, the script should build the kueri_demo database. Note that each table has a five-character ZIP column, but the tables do not have a formal

     parent -> child

relation between them.
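Because the tables share a ZIP column but no declared parent -> child relation, any cross-table question has to join them on ZIP explicitly. A toy sketch of that join, with hypothetical function and row names:

```javascript
// Illustrative only: join two row arrays on their shared five-character ZIP
// column, the way a query over kueri_demo would. joinOnZip() and the row
// shapes are hypothetical, not part of the article's actual script.
function joinOnZip(zbpRows, taxRows) {
  // Index the tax rows by ZIP for a fast lookup.
  const taxByZip = new Map(taxRows.map((row) => [row.ZIP, row]));
  // Keep only ZIPs present in both tables, merging matched rows.
  return zbpRows
    .filter((row) => taxByZip.has(row.ZIP))
    .map((row) => ({ ...row, ...taxByZip.get(row.ZIP) }));
}
```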
_______________

Part two here showed how to set up a Google Cloud SQL demo database which Kueri will front-end. Part three will describe how to install and configure the Kueri product.

Kueri, a natural language database tool - Part One

_________________________

The Kueri user experience
_________________________

Front-end application software sits between the users and the database layer. That front-end software evolved from text-based applications to point-and-click GUI technologies that query the database and show the result set(s). Users can certainly input text through GUI text controls, but the words have no natural language structure. A natural language technology, where a user can type query sentences with standard grammar, is the next step. For many years, this seemed like science fiction, but Kueri - a natural language interface for databases - has arrived.

Kueri works as a front-end layer for many SQL-compatible databases and common-format spreadsheets. After installation and configuration, the user types a natural language query and Kueri returns a result set. Developers can even embed Kueri in the web and mobile software they build.

In this four-part article, I'll review Kueri and describe my experiences with it. Because I reviewed the product early, the folks at Kueri provided me with the Kueri SDK Enterprise edition, which they will release shortly. The Kueri Lite (Beta) SDK is available now. I built the review around a demo Kueri SDK Enterprise installation that assumes

  • the Kueri SDK Enterprise product
  • a Windows 7 Intel PC
  • a Google Cloud SQL database

as prerequisites. Download the Google Cloud SQL demo database creation script and the demo Kueri configuration file at this GitHub repository. Note that a Google Cloud SQL database is a MySQL database, so the database creation script should work in any MySQL environment with little or no modification.

Part one here will describe the Kueri user experience. Part two will describe the Google Cloud database that Kueri will front-end for this article. Part three will show how to install Kueri. Part four will explain how to configure a Kueri Data Source.
____________________

We'll first look at the Kueri user experience. In the main Kueri page


1. Open the Kueri Workspace
click "Workspace" to switch into the Workspace mode. Then, type a natural language query in the textbox

2. Try a query!
and hit "Enter". Kueri shows the result set. At the right, click "Switch to Guided mode" and below the textbox, Kueri will offer suggestions as the user types the query. Note that Guided mode is more restrictive as the user types the query. Click "Show the SQL" at the right to see the actual MySQL query

3. Kueri built this MySQL query
that Kueri built.
_______________

Part one here described the Kueri user experience. Part two will describe the Google Cloud database that Kueri will front-end for this article.

Front-end Google Cloud products with a Google Apps Script application - Part 4

_________________________

This article improves the Google Apps Script engineering featured in earlier Bit Vectors articles here, here, and here to add file save and print features to a Google Apps Script application. This article will also discuss the potential of Google Apps Script application sharing.

Part one showed the sample application user experience.

Part two described the application configuration.

Part three explained the application engineering.

Part four here will explore how Google Apps Script application sharing presently works and will explain why this is not yet ready for production, based on the present state of the Google Apps Script product. It will also discuss suggested feature enhancements to the Google Apps Script product that would make application sharing possible.

Download the three application component files from the GitHub repository.
____________________

In parts one through three, we saw how the application works and the engineering behind it. All this assumed that one Google account owned and used everything - the cloud data resources, the Google Drive assets, the web application, etc. At the single-account level, everything works well. Eventually, though, we might want to share the application with other Google accounts, so that the owners of those accounts can use it themselves, each with reliable security and privacy. Google Apps Script actually does support application sharing. Unfortunately, this sharing has security, privacy, and performance drawbacks, and because of these flaws, shared Google Apps Script applications won't work in production. Ideally, the Google Apps Script product will evolve to make this possible. Here in part four, we'll look at the situation more closely.

In the Google Apps Script editor, click the "cloud" Deploy as Web App...

1. Set app access to anyone.
icon to open the "Deploy as web app" dialog box. The lower "Who has access to the app:" dropdown defaults to "Only myself". Change this to "Anyone", and any actively logged-in Google account can use the application. To do so, copy the URL in the "Current web app URL:" textbox and send it out. Those people can drop that URL in a browser and proceed. From their perspectives, they will have the same user experience described in part one. Unfortunately, this has huge problems.

First, the application owner has no way to restrict access to a list of guest users, by Google account ID. Only the app owner can use it, or anyone can use it. In Figure 1, the  Share  icon at the top right controls access to the application code. It does not control access to the executable. The Google Apps Script product should offer granular share control of the executable.

Second, although a guest can certainly save a result set on the Google Drive of that guest, the app will build the scratchpad spreadsheets it needs on the Google Drive of the application owner. This creates many problems. First, if multiple guest users use the application at the same time, they would collide because they would use the same scratchpad spreadsheet. The

     returnScratchpadFileCollection()

function in utilities.gs names the scratchpad files in part with a date / time stamp. An ideal solution would name these files with the ID of the Google account, owner or guest, running the app; for each individual user, the email address of the account would serve as a unique value. Because the scratchpad files exist in the Google Drive of the application owner, the application could not see these email addresses, and the date / time stamp became necessary. Additionally, the scratchpad file(s) mapped to a guest would ideally exist in the Google Drive of that guest - not in that of the application owner. This would guarantee privacy for the guest users. As I tried to solve this problem, I built the now-commented code at the top of the

     returnScratchpadFileCollection()

function. Unfortunately, I did not succeed, but I decided to include this code in the file.
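The naming trade-off described above can be sketched as follows; returnScratchpadFileName() is a hypothetical stand-in for the naming logic inside returnScratchpadFileCollection(), not the actual code in utilities.gs.

```javascript
// Sketch of the scratchpad naming scheme. Today, the name carries a
// date / time stamp; an ideal solution would carry the running account's
// email address instead, so each user gets a private, collision-free file.
function returnScratchpadFileName(accountEmail, now) {
  if (accountEmail) {
    // Ideal: one scratchpad per account, so concurrent guests never collide.
    return 'scratchpad_' + accountEmail;
  }
  // Actual approach: fall back to a date / time stamp when the app
  // cannot see the account email.
  return 'scratchpad_' + now.toISOString().replace(/[:.]/g, '-');
}
```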

Third, a Google Apps Script application has a six-minute execution time limit. Google Apps Script offers a sleep function, but this has a time limit itself. An ideal solution would more flexibly pause, or sleep, the front-end application while the data layer proceeds. Then, when the data layer returns the result set, the front-end application would wake up. This way, a Google Apps Script application could reliably front-end a BigQuery query that lasts longer than six minutes.
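The pause-and-poll idea might look roughly like this sketch; checkFn, budgetMs, and intervalMs are assumed names, and in real Apps Script the loop body would call Utilities.sleep() between checks.

```javascript
// Sketch: poll the data layer until the result set is ready, or until a
// time budget runs out, instead of one long blocking sleep.
function pollForResult(checkFn, budgetMs, intervalMs) {
  const attempts = Math.floor(budgetMs / intervalMs);
  for (let i = 0; i < attempts; i++) {
    // In Apps Script, Utilities.sleep(intervalMs) would pause here.
    const result = checkFn(i * intervalMs); // ask the data layer: done yet?
    if (result !== null) return result;
  }
  return null; // budget exhausted; the query outlived the execution limit
}
```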

Through the Apps Marketplace, Google does offer an app distribution option to a restricted domain that might solve some of the identified distribution issues. However, this approach involves a lot of configuration and overhead, beyond the Google Apps Script environment.
_______________

Part four here described the drawbacks of shared execution of a Google Apps Script application, and the product enhancements that would help solve those drawbacks. Hopefully, Google will make these enhancements. When they do, Google Cloud will become the dominant cloud-space product line.

Front-end Google Cloud products with a Google Apps Script application - Part 3

_________________________

This article improves the Google Apps Script engineering featured in earlier Bit Vectors articles here, here, and here to add file save and print features to a Google Apps Script application. This article will also discuss the potential of Google Apps Script application sharing.

Part one showed the sample application user experience.

Part two described the application configuration.

Part three here will explain the application engineering.

Part four will explore how Google Apps Script application sharing presently works and will explain why this is not yet ready for production, based on the present state of the Google Apps Script product. It will also discuss suggested feature enhancements to the Google Apps Script product that would make application sharing possible.

Download the three application component files from the GitHub repository.
____________________

Three files comprise the demo application. BigQueryDemoApp.html more or less clones the BigQueryDemoApp file of the earlier Bit Vectors article, with some small differences. First, the form RESET button calls a revised #RESET jQuery function


1. The form RESET button calls this #RESET jQuery function
that only resets the web form controls. The original version called a Google Apps Script function that also reset the spreadsheet tied to the original application. Second, this file places slightly different controls with slightly different form behavior on the web page itself.

The Code.gs file has three functions. When the application first loads, it runs the

2. The doGet() function
doGet() function. The doGet() function first calls the

     returnScratchpadFileCollection()

located in the utilities.gs file to build a "scratchpad" Google Sheets file. The Google Drive of the active Google account owns this file. That called function returns the file ID of the created file, and the next line in doGet() saves that value as a script property. We'll look at the

     returnScratchpadFileCollection()

function in more detail shortly. Finally, doGet() calls the

     createHtmlOutputFromFile()

function, which points to the BigQueryDemoApp.html file. This called function loads the application web page; the createHtmlOutputFromFile() function helps enable potential application sharing. We'll look at potential application sharing more closely in Part Four.
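The doGet() control flow described above can be sketched with the Apps Script services injected as a parameter, so the flow itself can run anywhere; the wrapper name, the service method names, and the 'scratchpadFileId' property key are assumptions, not the application's actual identifiers.

```javascript
// Hedged sketch of the doGet() flow: build the scratchpad file, save its
// ID as a script property, then load the application web page.
function doGetFlow(services) {
  // 1. utilities.gs builds the scratchpad file and returns its file ID.
  const fileId = services.returnScratchpadFileCollection();
  // 2. Save that ID as a script property for later calls.
  services.setScriptProperty('scratchpadFileId', fileId);
  // 3. Load the web page from the BigQueryDemoApp.html file.
  return services.createHtmlOutputFromFile('BigQueryDemoApp');
}
```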

To run a query, the user makes zero or more picks from the web form controls and clicks the  SUBMIT  button. Just like the application described in this Bit Vectors article, the onclick() of this control calls the BigQueryDemoApp.html file

     submitParams()

function. This function gathers the picked form control values into both an array and a separate BigQuery query string


3. The last line of submitParams() calls
returnFormParams() in Code.gs
passing these parameters to the Code.gs returnFormParams() function. In Code.gs

4. The returnFormParams() function
in the Code.gs file
the called returnFormParams() function first retrieves the "scratchpad" Google sheet file created when the application loaded. The function places the web form parameter values, passed in the arrayParam parameter, on the sheet, and then calls the Code.gs function runQuery(), passing the assembled BigQuery query string with the queryString string parameter. The runQuery() function queries the BigQuery datasource, ultimately returning the result set back to the writeWebForm() function located in the HTML file. The writeWebForm() function writes the result set values on the web form and enables all the web form buttons. This feature space matches the application described in the earlier Bit Vectors article fairly closely.
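The parameter-gathering half of this flow can be sketched as follows; the table name, the column names, and buildQueryString() itself are illustrative assumptions, not the application's real identifiers.

```javascript
// Sketch of what submitParams() has to do: fold the picked web form
// control values into a BigQuery query string, skipping untouched controls.
function buildQueryString(picks) {
  const clauses = Object.entries(picks)
    .filter(([, value]) => value !== '') // skip controls the user left alone
    .map(([column, value]) => column + " = '" + value + "'");
  let query = 'SELECT * FROM zbp11totals';
  if (clauses.length > 0) query += ' WHERE ' + clauses.join(' AND ');
  return query;
}
```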

At this point, the jQuery in the HTML file enables the web form SAVE button. The SAVE button calls the savePrintSheet() function

5. The savePrintsheet() function
in the HTML file. Because the finished application provides the quantiles function, the savePrintSheet() function has to handle a result set of variable size. The savePrintSheet() function calculates that size, stores it in the firstBlankRow variable, and passes this value to the funcSaveSheet() function in the google.script.run statement at line 72 of the Code.gs file.

Because of the jQuery enable control structure, the user can click  SAVE / PRINT  only after the application returns a result set after a run. At this point, when the user clicks  SAVE / PRINT , the funcSaveSheet() function


6. The funcSaveSheet() function - first part
will have the raw result set available on the scratchpad Google sheet file. The function first calls returnScratchpadFile() out in

7. The returnScratchpadFile() function
utilities.gs to open the scratchpad spreadsheet, returning its ID value. Next, funcSaveSheet() calls formatScratchpadSpreadsheet() in utilities.gs to format the sheet. Although the returnFormParams() function could have called formatScratchpadSpreadsheet(), that would become a wasted call if the user does not save or print the result set. Therefore, the call happens here.

The funcSaveSheet() function places the first sheet of the scratchpad spreadsheet into a variable, and then places the URL of that spreadsheet into another variable. A Google Sheet file defaults to one thousand rows and because a result set in this software could have as few as six rows, funcSaveSheet() uses the firstBlankRow parameter at line 209 to hide spreadsheet rows starting from firstBlankRow


8. The funcSaveSheet() function - second part
down to row one thousand. This matters because otherwise, a finished file with six rows would have a lot of blank, wasted space. String variable url_ext has the parameters needed to export the result set as a finished PDF; the response variable holds the content of the sheet formatted with those url_ext parameters. Line 230 returns the response variable content, transformed into a finished PDF file through the functions in the statement.
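The two calculations above, the hide range and the export URL, can be sketched like this; the helper names are hypothetical, and the application's actual url_ext parameter string may differ from the sample one shown in the test.

```javascript
// Sketch: which rows funcSaveSheet() hides, given the first blank row and
// the default Google Sheets last row (one thousand).
function rowsToHide(firstBlankRow, lastRow) {
  return { startRow: firstBlankRow, numRows: lastRow - firstBlankRow + 1 };
}

// Sketch: turn a spreadsheet's /edit URL into an /export URL carrying the
// url_ext PDF parameters.
function exportUrl(spreadsheetUrl, urlExt) {
  return spreadsheetUrl.replace(/\/edit.*$/, '/export?') + urlExt;
}
```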

After the call to funcSaveSheet(), the Code.gs file google.script.run statement at line 72 then calls the saveSheetWithCloudPrint() function


9. The saveSheetWithCloudPrint() function
located back in the HTML file. This function uses the cloudprint.Gadget object to build a new Cloud Print dialog box


10. Save or print the finished result set PDF
The Gadget object receives the finished PDF through the finishedDoc parameter in the line 88 call.


Google does offer the Drive Picker. In this application, the Drive Picker would have become the natural choice for the file save feature, except for one problem. I never found a way to pass a finished PDF file to a Drive Picker object as the file content to save. This has most likely occurred to Google and they will eventually offer it as a feature. For now, this became "a bridge too far". Of course, while the cloudprint.Gadget object shown here can save a file, it will only save files in the root directory of a Google Drive. However, the overall flexibility of this approach outweighs this flaw. In the future, Google will probably enable directory drill-downs in the cloudprint.Gadget object as well.
_______________

Part three here described the application engineering. Part four 
will explore how Google Apps Script application sharing could work for this application, and why we should avoid it at this time.

Front-end Google Cloud products with a Google Apps Script application - Part 2

_________________________

This article improves the Google Apps Script engineering featured in earlier Bit Vectors articles here, here, and here to add file save and print features to a Google Apps Script application. This article will also discuss the potential of Google Apps Script application sharing.

Part one showed the sample application user experience.

Part two here will describe the application configuration.

Part three will explain the application engineering.

Part four will explore how Google Apps Script application sharing presently works and will explain why this is not yet ready for production, based on the present state of the Google Apps Script product. It will also discuss suggested feature enhancements to the Google Apps Script product that would make application sharing possible.

Download the three application component files from the GitHub repository.
____________________

Before starting on the application front-end, first build the associated BigQuery project as described here. Although parts of that process - the dialog boxes, steps, etc. - have changed since the original article publication, the core ideas remain. One new step involves Drive API configuration. At the Developer's Console, enable the BigQuery and Drive APIs. The process involves the same steps for both APIs; here, we'll focus on the Drive API. First search for the Drive API


1. Search for the Drive API for the project
and in this

2. Enable the Drive API
panel, click  Enable . Wait a few moments, and the API Manager will enable the Drive API

3. The Drive API is enabled
for the project. Do the same for the BigQuery API. Next, we'll need the Project ID. In the Cloud Console



4. The project ID value
look for the project ID value, circled and hidden here in Figure 4. In Google Drive


5. Build a new Google Apps Script application
build a new Google Apps Script application. At this point, the dropdown might not show the Google Apps Script pick. If this happens, click Connect more apps and in the dialog that opens, look for and add "Apps Script", and proceed.

When the Google Apps Script editor opens, name the project at the upper left. Drill down to

     File -> Project properties

6. Pick Project properties in the dropdown
to open this


7. Script properties: add the project Id
box, adding the Project ID value from Figure 4 above.

Finally, in the script editor, drill down to

     Resources -> Advanced Google Services

to open the Advanced Google Services

8. Advanced Google Services
picker. Turn the BigQuery and Drive APIs  on .

When the application first runs, its configuration will need a few more steps. We'll look at them now.

When a user first deploys the application, this dialog


9. Application first run: authorization required
will open. Click "Continue" to open an authorization dialog

10. The authorization dialog
that shows an  Allow  button at the bottom. Click that  Allow  button to proceed.

Next, the application will probably want additional authorizations to use the Drive and BigQuery APIs. It might seem that the earlier steps would cover this, but they might not have. Google Apps Script can show the messages asking for these authorizations in two ways, and we'll cover both of them now.

When the application runs at this point, it could show an error message like


11. Message: enable the Drive API
this. For the Drive API, the second sentence of this error message has a URL pointing to this


12. Enable the Drive API
page. Enable the API and run the application again. If the web application form looks okay but nothing happens when you click  SUBMIT , switch back to the script editor. Drill down to

     View -> Execution transcript

and scroll down, looking for


13. URL to configure the BigQuery API
a failure message with a URL. Open that URL in a new tab, proceed as instructed, and run the application. Note that for this "extra" API authorization, either of these techniques should work.

The same Google account should "own" both the Drive and the Cloud resources used by the application. In other words, a single Google account should own the application in and through Google Drive, and that same Google account should also reference, or point to, the Cloud resource or resources used by that application. The OAuth2 machinery drives this security behavior.
_______________

Part two here described the application configuration. Part three will explain the application engineering.


Front-end Google Cloud products with a Google Apps Script application - Part 1

_________________________

In an earlier article, I showed how to front-end Google BigQuery with a Google Docs Spreadsheet / Google Apps Script application. Later articles generalized the engineering: this article described a Google Cloud SQL front-end application and this article explained how to integrate BigQuery and Cloud SQL to build BigQuery stored procedure functionality. This engineering works well enough, but it has some drawbacks:

First, it offers no way to save a result set as a finished file.

Second, it offers no way to print a result set.

Third, the sample applications rely on a container-based architecture - in other words, the application script files are bound to a Google Sheets file as components of that file.

This article describes a sample Google Apps Script application that front-ends a BigQuery project. However, the application also offers file save and print features. Additionally, while its engineering relies on Google Sheets, it breaks the Google Sheets dependency of the earlier article samples.

Lastly, this article will explore application sharing - the idea that the application can serve more than one user. We'll see that although the engineering described in this article is not recommended for that use case, the engineering can become the basis for future shared production applications if Google Apps Script technology evolves as it ideally will.


Part one here will show the sample application user experience.

Part two will describe the application configuration.

Part three will explain the application engineering.

Part four will explore how Google Apps Script application sharing presently works and will explain why this is not yet ready for production, based on the present state of the Google Apps Script product. It will also discuss suggested feature enhancements to the Google Apps Script product that would make application sharing possible.

Download the three application component files from the GitHub repository.
____________________

The application described in this article works a lot like the application described in this earlier Bit Vectors article. It front-ends the "Complete ZIP Code Totals File" from the U.S. Census Bureau, placed in one table in a Google BigQuery project. The user starts with a URL pointing to the application to open this


1. The BigQuery application web form
form. The form disables the SAVE / PRINT button at this point. The user picks zero or more values in the dropdowns and clicks SUBMIT. When the application shows the result set


2. The application result set
the SAVE / PRINT button enables. The user can click the RESET button to reset the form back to its original state, or (s)he can click the SAVE / PRINT button to open the


3. The Print / Save dialog
print / save dialog. With this dialog, the user can print a spreadsheet-formatted result set through the Google Cloud Print! link. This will open dialog boxes to print the result set through a printer. If the active Google account running the application has access rights to one or more printers configured earlier through Google Cloud Print!, printing can happen through them. This dialog can also configure a new printer. The user can also click

     Save to Google Drive

and then

     Advanced options...

to open this second
4. Save the output in the Google Drive of the user
dialog. The user types a file name in the

     Google Drive File Name

textbox and clicks  Print . The application saves the spreadsheet equivalent of the form - with the result set and all the control picks - as a PDF placed in the Google Drive


5. The output file saved in Google Drive
"owned" by the active Google account. The application saved the file - named "Application Output" here - as a conventional PDF

6. The Application Output PDF file
and the user can copy / rename / share / etc. the file. The Cloud Print dialog above defaults the file save to the root directory. It does not handle directory drill-downs. Google explains that the Cloud Print dialog is in beta; hopefully, it will soon offer directory drill-down flexibility.
_______________
Part one showed the application user experience. Part two will describe the application configuration.