Practice Free Professional Data Engineer Exam Online Questions
You have a job that you want to cancel. It is a streaming pipeline, and you want to ensure that any data that is in-flight is processed and written to the output.
Which of the following commands can you use on the Dataflow monitoring console to stop the pipeline job?
- A . Cancel
- B . Drain
- C . Stop
- D . Finish
B
Explanation:
Using the Drain option to stop your job tells the Dataflow service to finish your job in its current state. Your job will immediately stop ingesting new data from input sources, but the Dataflow service will preserve any existing resources (such as worker instances) to finish processing and writing any buffered data in your pipeline.
Reference: https://cloud.google.com/dataflow/pipelines/stopping-a-pipeline
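For reference, the same drain can be requested outside the monitoring console, either with `gcloud dataflow jobs drain JOB_ID --region=REGION` or through the Dataflow REST API. The snippet below is a minimal Python sketch using the google-api-python-client discovery client; the project, region, and job ID are placeholders, and it assumes Application Default Credentials are already configured.

```python
# Sketch: ask the Dataflow service to drain a streaming job via the v1b3 API.
# PROJECT_ID, REGION, and JOB_ID are placeholder values.
from googleapiclient.discovery import build

PROJECT_ID = "my-project"
REGION = "us-central1"
JOB_ID = "my-streaming-job-id"

dataflow = build("dataflow", "v1b3")

# Setting requestedState to JOB_STATE_DRAINED stops ingestion of new data
# while letting the service finish processing any buffered, in-flight data.
response = dataflow.projects().locations().jobs().update(
    projectId=PROJECT_ID,
    location=REGION,
    jobId=JOB_ID,
    body={"requestedState": "JOB_STATE_DRAINED"},
).execute()

print(response.get("currentState"))
```

Note that draining applies only to streaming pipelines; batch jobs can only be cancelled.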
You need to migrate a 2 TB relational database to Google Cloud Platform. You do not have the resources to significantly refactor the application that uses this database, and the cost to operate it is of primary concern.
Which service do you select for storing and serving your data?
- A . Cloud Spanner
- B . Cloud Bigtable
- C . Cloud Firestore
- D . Cloud SQL
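D
Explanation:
Cloud SQL is a fully managed relational database service, so the application can keep using its existing schema and SQL queries without significant refactoring, and a 2 TB database fits comfortably within Cloud SQL's storage limits. Cloud Spanner would also work but is far more expensive to operate, while Cloud Bigtable and Cloud Firestore are non-relational and would require rewriting the application.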
Topic 6, Main Questions Set C
You are training a spam classifier. You notice that you are overfitting the training data.
Which three actions can you take to resolve this problem? (Choose three.)
- A . Get more training examples
- B . Reduce the number of training examples
- C . Use a smaller set of features
- D . Use a larger set of features
- E . Increase the regularization parameters
- F . Decrease the regularization parameters
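A, C, E
Explanation:
Overfitting means the classifier has learned noise that is specific to the training set rather than patterns that generalize. Getting more training examples, using a smaller set of features, and increasing the regularization parameters all push the model toward simpler, more general solutions; shrinking the training set, adding features, or decreasing regularization would make overfitting worse.

As a concrete illustration of option E, the sketch below uses scikit-learn (an assumption, not part of the question) with synthetic placeholder data rather than a real spam corpus. In LogisticRegression, a smaller `C` means stronger L2 regularization, which typically narrows the gap between training and test accuracy.

```python
# Sketch: stronger regularization as a remedy for overfitting (option E).
# The data is synthetic; a real spam classifier would use text features.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Many features but few informative ones, so an unregularized model
# tends to fit noise in the training split.
X, y = make_classification(n_samples=500, n_features=200,
                           n_informative=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for C in (1.0, 0.1, 0.01):  # smaller C = stronger L2 regularization
    clf = LogisticRegression(C=C, max_iter=1000).fit(X_train, y_train)
    print(f"C={C}: train={clf.score(X_train, y_train):.3f}, "
          f"test={clf.score(X_test, y_test):.3f}")
```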
Which of these operations can you perform from the BigQuery Web UI?
- A . Upload a file in SQL format.
- B . Load data with nested and repeated fields.
- C . Upload a 20 MB file.
- D . Upload multiple files using a wildcard.
B
Explanation:
You can load data with nested and repeated fields using the Web UI.
You cannot use the Web UI to:
- Upload a file greater than 10 MB in size
- Upload multiple files at the same time
- Upload a file in SQL format
All three of the above operations can be performed using the "bq" command.
Reference: https://cloud.google.com/bigquery/loading-data
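To see what loading nested and repeated fields looks like outside the Web UI, here is a minimal sketch using the google-cloud-bigquery Python client; the table ID, source file, and field names are placeholders.

```python
# Sketch: load newline-delimited JSON containing a nested, repeated field.
# The destination table, source file, and schema are illustrative only.
from google.cloud import bigquery

client = bigquery.Client()

schema = [
    bigquery.SchemaField("user_id", "STRING"),
    # A RECORD field with mode=REPEATED models nested, repeated data.
    bigquery.SchemaField(
        "addresses", "RECORD", mode="REPEATED",
        fields=[
            bigquery.SchemaField("city", "STRING"),
            bigquery.SchemaField("zip", "STRING"),
        ],
    ),
]

job_config = bigquery.LoadJobConfig(
    schema=schema,
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
)

with open("users.json", "rb") as source_file:
    load_job = client.load_table_from_file(
        source_file, "my-project.my_dataset.users", job_config=job_config
    )
load_job.result()  # wait for the load job to finish
```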