metabase
$ docker run --rm -p 3000:3000 metabase/metabase
11-16 09:07:49 INFO metabase.util :: Loading Metabase...
11-16 09:07:58 INFO util.encryption :: DB details encryption is DISABLED for this Metabase instance. 🔓
See http://www.metabase.com/docs/latest/operations-guide/start.html#encrypting-your-database-connection-details-at-rest for more information.
11-16 09:08:09 INFO metabase.core :: Starting Metabase in STANDALONE mode
11-16 09:08:09 INFO metabase.core :: Launching Embedded Jetty Webserver with config:
{:port 3000, :host "0.0.0.0"}
11-16 09:08:09 INFO metabase.core :: Starting Metabase version v0.26.2 (3b65f11 release-0.26.2) ...
11-16 09:08:09 INFO metabase.core :: System timezone is 'GMT' ...
11-16 09:08:10 WARN metabase.driver :: No -init-driver function found for 'metabase.driver.google'
11-16 09:08:10 INFO metabase.core :: Setting up and migrating Metabase DB. Please sit tight, this may take a minute...
11-16 09:08:10 INFO metabase.db :: Verifying h2 Database Connection ...
11-16 09:08:11 INFO metabase.db :: Verify Database Connection ... ✅
11-16 09:08:11 INFO metabase.db :: Running Database Migrations...
11-16 09:08:11 INFO metabase.db :: Setting up Liquibase...
11-16 09:08:11 INFO metabase.db :: Liquibase is ready.
11-16 09:08:11 INFO metabase.db :: Checking if Database has unrun migrations...
11-16 09:08:15 INFO metabase.db :: Database has unrun migrations. Waiting for migration lock to be cleared...
11-16 09:08:15 INFO metabase.db :: Migration lock is cleared. Running migrations...
11-16 09:09:33 INFO metabase.db :: Database Migrations Current ... ✅
com.mchange.v2.cfg.DelayedLogItem [ level -> FINE, text -> "The configuration file for resource identifier 'hocon:/reference,/application,/c3p0,/' could not be found. Skipping.", exception -> null]
11-16 09:09:34 INFO db.migrations :: Running all necessary data migrations, this may take a minute.
11-16 09:09:34 INFO db.migrations :: Running data migration 'set-card-database-and-table-ids'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'set-mongodb-databases-ssl-false'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'set-default-schemas'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'set-admin-email'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'remove-database-sync-activity-entries'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'update-dashboards-to-new-grid'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'migrate-field-visibility-type'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'add-users-to-default-permissions-groups'...
11-16 09:09:34 INFO models.permissions-group :: Created magic permissions group 'All Users' (ID = 1)
11-16 09:09:34 INFO models.permissions-group :: Created magic permissions group 'Administrators' (ID = 2)
11-16 09:09:34 INFO db.migrations :: Running data migration 'add-admin-group-root-entry'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'add-databases-to-magic-permissions-groups'...
11-16 09:09:34 INFO models.permissions-group :: Created magic permissions group 'MetaBot' (ID = 3)
11-16 09:09:34 INFO db.migrations :: Running data migration 'migrate-field-types'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'fix-invalid-field-types'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'copy-site-url-setting-and-remove-trailing-slashes'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'migrate-query-executions'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'drop-old-query-execution-table'...
11-16 09:09:34 INFO db.migrations :: Running data migration 'ensure-protocol-specified-in-site-url'...
11-16 09:09:34 INFO db.migrations :: Finished running data migrations.
11-16 09:09:34 INFO metabase.events :: Starting events listener: metabase.events.activity-feed 👂
11-16 09:09:34 INFO metabase.events :: Starting events listener: metabase.events.dependencies 👂
11-16 09:09:34 INFO metabase.events :: Starting events listener: metabase.events.driver-notifications 👂
11-16 09:09:34 INFO metabase.events :: Starting events listener: metabase.events.last-login 👂
11-16 09:09:34 INFO metabase.events :: Starting events listener: metabase.events.metabot-lifecycle 👂
11-16 09:09:35 INFO metabase.events :: Starting events listener: metabase.events.notifications 👂
11-16 09:09:35 INFO metabase.events :: Starting events listener: metabase.events.revision 👂
11-16 09:09:35 INFO metabase.events :: Starting events listener: metabase.events.sync-database 👂
11-16 09:09:35 INFO metabase.events :: Starting events listener: metabase.events.view-log 👂
11-16 09:09:35 INFO metabase.task :: Loading tasks namespace: metabase.task.follow-up-emails 📆
11-16 09:09:35 INFO metabase.task :: Loading tasks namespace: metabase.task.send-anonymous-stats 📆
11-16 09:09:35 INFO metabase.task :: Loading tasks namespace: metabase.task.send-pulses 📆
11-16 09:09:35 INFO metabase.task :: Loading tasks namespace: metabase.task.sync-databases 📆
11-16 09:09:35 INFO metabase.task :: Loading tasks namespace: metabase.task.upgrade-checks 📆
11-16 09:09:35 INFO metabase.core :: Looks like this is a new installation ... preparing setup wizard
11-16 09:09:35 INFO metabase.core :: Please use the following url to setup your Metabase installation:
http://0.0.0.0:3000/setup/
11-16 09:09:35 INFO metabase.sample-data :: Loading sample dataset...
11-16 09:09:35 DEBUG sync.util :: Sync operations in flight: {:sync #{1}}
11-16 09:09:35 INFO sync.util :: STARTING: Sync h2 Database 1 'Sample Dataset'
11-16 09:09:35 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :sync-metadata #{1}}
11-16 09:09:35 INFO sync.util :: STARTING: Sync metadata for h2 Database 1 'Sample Dataset'
11-16 09:09:35 INFO sync-metadata.tables :: Found new tables: (Table 'PUBLIC.PRODUCTS' Table 'PUBLIC.ORDERS' Table 'PUBLIC.PEOPLE' Table 'PUBLIC.REVIEWS')
11-16 09:09:36 INFO sync-metadata.fks :: Marking foreign key from Table 2 'PUBLIC.ORDERS' Field 9 'USER_ID' -> Table 3 'PUBLIC.PEOPLE' Field 23 'ID'
11-16 09:09:36 INFO sync-metadata.fks :: Marking foreign key from Table 2 'PUBLIC.ORDERS' Field 12 'PRODUCT_ID' -> Table 1 'PUBLIC.PRODUCTS' Field 7 'ID'
11-16 09:09:36 INFO sync-metadata.fks :: Marking foreign key from Table 4 'PUBLIC.REVIEWS' Field 32 'PRODUCT_ID' -> Table 1 'PUBLIC.PRODUCTS' Field 7 'ID'
11-16 09:09:36 INFO sync.util :: FINISHED: Sync metadata for h2 Database 1 'Sample Dataset' (1 s)
11-16 09:09:36 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :analyze #{1}}
11-16 09:09:36 INFO sync.util :: STARTING: Analyze data for h2 Database 1 'Sample Dataset'
11-16 09:09:36 INFO middleware.cache :: Using query processor cache backend: :db 💾
11-16 09:09:36 DEBUG analyze.table-row-count :: Set table row count for Table 1 'PUBLIC.PRODUCTS' to 200
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 1 'VENDOR'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 2 'TITLE'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 3 'PRICE'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 4 'CATEGORY'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 5 'RATING'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 6 'EAN'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 7 'ID'
11-16 09:09:37 DEBUG analyze.fingerprint :: Saving fingerprint for Field 8 'CREATED_AT'
11-16 09:09:37 DEBUG classifiers.category :: Field 1 'VENDOR' has 200 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 1 'VENDOR': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.category :: Field 2 'TITLE' has 200 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 2 'TITLE': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.category :: Field 3 'PRICE' has 199 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 3 'PRICE': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.category :: Field 4 'CATEGORY' has 4 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 4 'CATEGORY': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.category :: Field 5 'RATING' has 25 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 5 'RATING': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.category :: Field 6 'EAN' has 200 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:37 DEBUG analyze.classify :: Based on classification, updating these values of Field 6 'EAN': {:special_type :type/Category}
11-16 09:09:37 DEBUG classifiers.name :: Based on the name of Field 7 'ID', we're giving it a special type of :type/PK.
11-16 09:09:37 INFO sync.analyze :: [************······································] 😒 25% Analyzed Table 1 'PUBLIC.PRODUCTS'
11-16 09:09:37 DEBUG analyze.table-row-count :: Set table row count for Table 2 'PUBLIC.ORDERS' to 17624
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 9 'USER_ID'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 10 'SUBTOTAL'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 11 'TAX'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 12 'PRODUCT_ID'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 13 'TOTAL'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 14 'ID'
11-16 09:09:38 DEBUG analyze.fingerprint :: Saving fingerprint for Field 15 'CREATED_AT'
11-16 09:09:38 DEBUG classifiers.category :: Field 10 'SUBTOTAL' has 199 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:38 DEBUG analyze.classify :: Based on classification, updating these values of Field 10 'SUBTOTAL': {:special_type :type/Category}
11-16 09:09:38 DEBUG classifiers.name :: Based on the name of Field 14 'ID', we're giving it a special type of :type/PK.
11-16 09:09:38 INFO sync.analyze :: [*************************·························] 😬 50% Analyzed Table 2 'PUBLIC.ORDERS'
11-16 09:09:38 DEBUG analyze.table-row-count :: Set table row count for Table 3 'PUBLIC.PEOPLE' to 2500
11-16 09:09:39 DEBUG analyze.fingerprint :: Saving fingerprint for Field 16 'CITY'
11-16 09:09:39 DEBUG analyze.fingerprint :: Saving fingerprint for Field 17 'BIRTH_DATE'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 18 'SOURCE'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 19 'LATITUDE'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 20 'ADDRESS'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 21 'ZIP'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 22 'EMAIL'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 23 'ID'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 24 'NAME'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 25 'PASSWORD'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 26 'STATE'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 27 'LONGITUDE'
11-16 09:09:40 DEBUG analyze.fingerprint :: Saving fingerprint for Field 28 'CREATED_AT'
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 16 'CITY', we're giving it a special type of :type/City.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 16 'CITY': {:special_type :type/City}
11-16 09:09:40 DEBUG classifiers.category :: Field 18 'SOURCE' has 5 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 18 'SOURCE': {:special_type :type/Category}
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 19 'LATITUDE', we're giving it a special type of :type/Latitude.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 19 'LATITUDE': {:special_type :type/Latitude}
11-16 09:09:40 DEBUG classifiers.text-fingerprint :: Based on the fingerprint of Field 22 'EMAIL', we're marking it as :type/Email.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 22 'EMAIL': {:special_type :type/Email}
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 23 'ID', we're giving it a special type of :type/PK.
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 24 'NAME', we're giving it a special type of :type/Name.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 24 'NAME': {:special_type :type/Name}
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 26 'STATE', we're giving it a special type of :type/State.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 26 'STATE': {:special_type :type/State}
11-16 09:09:40 DEBUG classifiers.name :: Based on the name of Field 27 'LONGITUDE', we're giving it a special type of :type/Longitude.
11-16 09:09:40 DEBUG analyze.classify :: Based on classification, updating these values of Field 27 'LONGITUDE': {:special_type :type/Longitude}
11-16 09:09:40 INFO sync.analyze :: [*************************************·············] 😋 75% Analyzed Table 3 'PUBLIC.PEOPLE'
11-16 09:09:40 DEBUG analyze.table-row-count :: Set table row count for Table 4 'PUBLIC.REVIEWS' to 1078
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 29 'REVIEWER'
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 30 'BODY'
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 31 'RATING'
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 32 'PRODUCT_ID'
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 33 'ID'
11-16 09:09:41 DEBUG analyze.fingerprint :: Saving fingerprint for Field 34 'CREATED_AT'
11-16 09:09:41 DEBUG analyze.classify :: Based on classification, updating these values of Field 30 'BODY': {:preview_display false}
11-16 09:09:41 DEBUG classifiers.name :: Based on the name of Field 31 'RATING', we're giving it a special type of :type/Category.
11-16 09:09:41 DEBUG analyze.classify :: Based on classification, updating these values of Field 31 'RATING': {:special_type :type/Category}
11-16 09:09:41 DEBUG classifiers.name :: Based on the name of Field 33 'ID', we're giving it a special type of :type/PK.
11-16 09:09:41 INFO sync.analyze :: [**************************************************] 😎 100% Analyzed Table 4 'PUBLIC.REVIEWS'
11-16 09:09:41 INFO sync.util :: FINISHED: Analyze data for h2 Database 1 'Sample Dataset' (5 s)
11-16 09:09:41 DEBUG sync.util :: Sync operations in flight: {:sync #{1}, :cache-field-values #{1}}
11-16 09:09:41 INFO sync.util :: STARTING: Cache field values in h2 Database 1 'Sample Dataset'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 1 'VENDOR'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 2 'TITLE'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 3 'PRICE'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 4 'CATEGORY'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 5 'RATING'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 6 'EAN'
11-16 09:09:41 DEBUG sync.field-values :: Looking into updating FieldValues for Field 10 'SUBTOTAL'
11-16 09:09:42 DEBUG sync.field-values :: Looking into updating FieldValues for Field 16 'CITY'
11-16 09:09:42 DEBUG sync.field-values :: Looking into updating FieldValues for Field 18 'SOURCE'
11-16 09:09:42 DEBUG sync.field-values :: Looking into updating FieldValues for Field 24 'NAME'
11-16 09:09:42 DEBUG sync.field-values :: Looking into updating FieldValues for Field 26 'STATE'
11-16 09:09:42 DEBUG sync.field-values :: Looking into updating FieldValues for Field 31 'RATING'
11-16 09:09:42 INFO sync.util :: FINISHED: Cache field values in h2 Database 1 'Sample Dataset' (853 ms)
11-16 09:09:42 INFO sync.util :: FINISHED: Sync h2 Database 1 'Sample Dataset' (7 s)
11-16 09:09:42 INFO metabase.core :: Metabase Initialization COMPLETE
11-16 09:10:38 INFO metabase.middleware :: Setting Metabase site URL to localhost:3000
11-16 09:11:42 INFO driver.google :: Fetching Google access/refresh tokens with auth-code ...
11-16 09:11:46 INFO models.user :: Adding user 1 to All Users permissions group...
11-16 09:11:46 INFO models.user :: Adding user 1 to Admin permissions group...
11-16 09:11:46 DEBUG sync.util :: Sync operations in flight: {:sync #{2}}
11-16 09:11:46 INFO sync.util :: STARTING: Sync bigquery Database 2 'bq'
11-16 09:11:46 DEBUG sync.util :: Sync operations in flight: {:sync #{2}, :sync-metadata #{2}}
11-16 09:11:46 INFO sync.util :: STARTING: Sync metadata for bigquery Database 2 'bq'
11-16 09:11:48 INFO sync-metadata.tables :: Found new tables: (Table 'tupac_sightings')
11-16 09:11:49 INFO sync.util :: FINISHED: Sync metadata for bigquery Database 2 'bq' (3 s)
11-16 09:11:49 DEBUG sync.util :: Sync operations in flight: {:sync #{2}, :analyze #{2}}
11-16 09:11:49 INFO sync.util :: STARTING: Analyze data for bigquery Database 2 'bq'
11-16 09:11:51 DEBUG analyze.table-row-count :: Set table row count for Table 5 'tupac_sightings' to 1000
11-16 09:11:52 DEBUG analyze.fingerprint :: Saving fingerprint for Field 35 'id'
11-16 09:11:52 DEBUG analyze.fingerprint :: Saving fingerprint for Field 36 'city_id'
11-16 09:11:52 DEBUG analyze.fingerprint :: Saving fingerprint for Field 37 'category_id'
11-16 09:11:52 DEBUG analyze.fingerprint :: Saving fingerprint for Field 38 'timestamp'
11-16 09:11:52 DEBUG classifiers.name :: Based on the name of Field 35 'id', we're giving it a special type of :type/PK.
11-16 09:11:52 DEBUG analyze.classify :: Based on classification, updating these values of Field 35 'id': {:special_type :type/PK}
11-16 09:11:52 DEBUG classifiers.category :: Field 36 'city_id' has 150 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:11:52 DEBUG analyze.classify :: Based on classification, updating these values of Field 36 'city_id': {:special_type :type/Category}
11-16 09:11:52 DEBUG classifiers.category :: Field 37 'category_id' has 15 distinct values. Since that is less than 300, we're marking it as a category.
11-16 09:11:52 DEBUG analyze.classify :: Based on classification, updating these values of Field 37 'category_id': {:special_type :type/Category}
11-16 09:11:52 INFO sync.analyze :: [**************************************************] 😎 100% Analyzed Table 5 'tupac_sightings'
11-16 09:11:52 INFO sync.util :: FINISHED: Analyze data for bigquery Database 2 'bq' (3 s)
11-16 09:11:52 DEBUG sync.util :: Sync operations in flight: {:sync #{2}, :cache-field-values #{2}}
11-16 09:11:52 INFO sync.util :: STARTING: Cache field values in bigquery Database 2 'bq'
11-16 09:11:52 DEBUG sync.field-values :: Looking into updating FieldValues for Field 36 'city_id'
11-16 09:11:54 DEBUG sync.field-values :: Looking into updating FieldValues for Field 37 'category_id'
11-16 09:11:55 INFO sync.util :: FINISHED: Cache field values in bigquery Database 2 'bq' (3 s)
11-16 09:11:55 INFO sync.util :: FINISHED: Sync bigquery Database 2 'bq' (9 s)
11-16 09:14:12 DEBUG sync.util :: Sync operations in flight: {:sync-metadata #{2}}
11-16 09:14:12 INFO sync.util :: STARTING: Sync metadata for bigquery Database 2 'bq'
11-16 09:14:16 INFO sync-metadata.fields :: Marking Table 5 'tupac_sightings' Field 'category_id' as inactive.
11-16 09:14:16 INFO sync.util :: FINISHED: Sync metadata for bigquery Database 2 'bq' (5 s)
11-16 09:14:49 WARN api.label :: Labels are deprecated, and this API endpoint will be removed in a future version of Metabase.
11-16 09:15:12 WARN metabase.util :: auto-retry metabase.driver.google$execute$fn__45845@54134fdf: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
11-16 09:15:13 WARN metabase.util :: auto-retry metabase.driver.google$execute$fn__45845@54134fdf: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
11-16 09:15:15 WARN metabase.util :: auto-retry metabase.driver.bigquery$process_native_STAR_$fn__46288@5681e90: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
11-16 09:15:16 WARN metabase.util :: auto-retry metabase.driver.google$execute$fn__45845@b539efb: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
11-16 09:15:17 WARN metabase.util :: auto-retry metabase.driver.google$execute$fn__45845@b539efb: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
11-16 09:15:19 WARN metabase.query-processor :: {:status :failed,
:class clojure.lang.ExceptionInfo,
:error "Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.",
:stacktrace
["driver.google$execute_no_auto_retry.invokeStatic(google.clj:34)"
"driver.google$execute_no_auto_retry.invoke(google.clj:27)"
"driver.google$execute$fn__45845.invoke(google.clj:45)"
"util$do_with_auto_retries.invokeStatic(util.clj:712)"
"util$do_with_auto_retries.invoke(util.clj:704)"
"util$do_with_auto_retries.invokeStatic(util.clj:716)"
"util$do_with_auto_retries.invoke(util.clj:704)"
"util$do_with_auto_retries.invokeStatic(util.clj:716)"
"util$do_with_auto_retries.invoke(util.clj:704)"
"driver.google$execute.invokeStatic(google.clj:44)"
"driver.google$execute.invoke(google.clj:38)"
"driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:124)"
"driver.bigquery$execute_bigquery.invoke(bigquery.clj:113)"
"driver.bigquery$execute_bigquery.invokeStatic(bigquery.clj:115)"
"driver.bigquery$execute_bigquery.invoke(bigquery.clj:113)"
"driver.bigquery$process_native_STAR_$fn__46288.invoke(bigquery.clj:187)"
"util$do_with_auto_retries.invokeStatic(util.clj:712)"
"util$do_with_auto_retries.invoke(util.clj:704)"
"util$do_with_auto_retries.invokeStatic(util.clj:716)"
"util$do_with_auto_retries.invoke(util.clj:704)"
"driver.bigquery$process_native_STAR_.invokeStatic(bigquery.clj:186)"
"driver.bigquery$process_native_STAR_.invoke(bigquery.clj:183)"
"driver.bigquery$execute_query.invokeStatic(bigquery.clj:286)"
"driver.bigquery$execute_query.invoke(bigquery.clj:282)"
"driver$fn__22732$G__22725__22739.invoke(driver.clj:34)"
"query_processor$execute_query.invokeStatic(query_processor.clj:50)"
"query_processor$execute_query.invoke(query_processor.clj:44)"
"query_processor.middleware.mbql_to_native$mbql__GT_native$fn__26980.invoke(mbql_to_native.clj:30)"
"query_processor.middleware.annotate_and_sort$annotate_and_sort$fn__25408.invoke(annotate_and_sort.clj:41)"
"query_processor.middleware.limit$limit$fn__26935.invoke(limit.clj:14)"
"query_processor.middleware.cumulative_aggregations$cumulative_aggregation$fn__26797.invoke(cumulative_aggregations.clj:46)"
"query_processor.middleware.cumulative_aggregations$cumulative_aggregation$fn__26797.invoke(cumulative_aggregations.clj:46)"
"query_processor.middleware.format_rows$format_rows$fn__26925.invoke(format_rows.clj:21)"
"query_processor.middleware.binning$update_binning_strategy$fn__25487.invoke(binning.clj:172)"
"query_processor.middleware.results_metadata$record_and_return_metadata_BANG_$fn__28204.invoke(results_metadata.clj:89)"
"query_processor.middleware.resolve$resolve_middleware$fn__25015.invoke(resolve.clj:359)"
"query_processor.middleware.expand$expand_middleware$fn__26691.invoke(expand.clj:550)"
"query_processor.middleware.add_row_count_and_status$add_row_count_and_status$fn__25099.invoke(add_row_count_and_status.clj:14)"
"query_processor.middleware.driver_specific$process_query_in_context$fn__26817.invoke(driver_specific.clj:12)"
"query_processor.middleware.resolve_driver$resolve_driver$fn__28215.invoke(resolve_driver.clj:14)"
"query_processor.middleware.cache$maybe_return_cached_results$fn__25568.invoke(cache.clj:146)"
"query_processor.middleware.catch_exceptions$catch_exceptions$fn__26739.invoke(catch_exceptions.clj:58)"
"query_processor$process_query.invokeStatic(query_processor.clj:126)"
"query_processor$process_query.invoke(query_processor.clj:122)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:234)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:229)"
"query_processor$fn__28249$process_query_and_save_execution_BANG___28254$fn__28255.invoke(query_processor.clj:272)"
"query_processor$fn__28249$process_query_and_save_execution_BANG___28254.invoke(query_processor.clj:258)"
"api.dataset$fn__30139$fn__30142.invoke(dataset.clj:63)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:248)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:243)"
"api.dataset$fn__30139.invokeStatic(dataset.clj:54)"
"api.dataset$fn__30139.invoke(dataset.clj:54)"
"middleware$enforce_authentication$fn__29528.invoke(middleware.clj:122)"
"api.routes$fn__43820.invokeStatic(routes.clj:61)"
"api.routes$fn__43820.invoke(routes.clj:61)"
"routes$fn__44498$fn__44499.doInvoke(routes.clj:75)"
"routes$fn__44498.invokeStatic(routes.clj:71)"
"routes$fn__44498.invoke(routes.clj:71)"
"middleware$log_api_call$fn__29627$fn__29629.invoke(middleware.clj:330)"
"middleware$log_api_call$fn__29627.invoke(middleware.clj:329)"
"middleware$add_security_headers$fn__29577.invoke(middleware.clj:245)"
"middleware$bind_current_user$fn__29532.invoke(middleware.clj:142)"
"middleware$maybe_set_site_url$fn__29581.invoke(middleware.clj:268)"],
:query
{:type "query",
:query {:source_table 5},
:parameters [],
:constraints {:max-results 10000, :max-results-bare-rows 2000},
:info
{:executed-by 1,
:context :ad-hoc,
:card-id nil,
:nested? false,
:query-hash [108, 84, 50, 34, 11, 38, 119, 50, -56, -60, 42, 94, 101, -64, 90, 33, 112, 60, 30, 100, 92, 60, -86, -3, -114, -10, -114, -29, 62, -24, -76, -68],
:query-type "MBQL"}},
:expanded-query nil,
:ex-data
{"code" 400,
"errors"
[{"domain" "global", "location" "query", "locationType" "other", "message" "Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.", "reason" "invalidQuery"}],
"message" "Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'."}}
11-16 09:15:19 WARN metabase.query-processor :: Query failure: Field 'hden_metabase_dev.tupac_sightings.category_id' not found in table 'hden_metabase_dev.tupac_sightings'.
["query_processor$assert_query_status_successful.invokeStatic(query_processor.clj:203)"
"query_processor$assert_query_status_successful.invoke(query_processor.clj:196)"
"query_processor$run_and_save_query_BANG_.invokeStatic(query_processor.clj:235)"
"query_processor$run_and_save_query_BANG_.invoke(query_processor.clj:229)"
"query_processor$fn__28249$process_query_and_save_execution_BANG___28254$fn__28255.invoke(query_processor.clj:272)"
"query_processor$fn__28249$process_query_and_save_execution_BANG___28254.invoke(query_processor.clj:258)"
"api.dataset$fn__30139$fn__30142.invoke(dataset.clj:63)"
"api.common.internal$do_with_caught_api_exceptions.invokeStatic(internal.clj:248)"
"api.common.internal$do_with_caught_api_exceptions.invoke(internal.clj:243)"
"api.dataset$fn__30139.invokeStatic(dataset.clj:54)"
"api.dataset$fn__30139.invoke(dataset.clj:54)"
"middleware$enforce_authentication$fn__29528.invoke(middleware.clj:122)"
"api.routes$fn__43820.invokeStatic(routes.clj:61)"
"api.routes$fn__43820.invoke(routes.clj:61)"
"routes$fn__44498$fn__44499.doInvoke(routes.clj:75)"
"routes$fn__44498.invokeStatic(routes.clj:71)"
"routes$fn__44498.invoke(routes.clj:71)"
"middleware$log_api_call$fn__29627$fn__29629.invoke(middleware.clj:330)"
"middleware$log_api_call$fn__29627.invoke(middleware.clj:329)"
"middleware$add_security_headers$fn__29577.invoke(middleware.clj:245)"
"middleware$bind_current_user$fn__29532.invoke(middleware.clj:142)"
"middleware$maybe_set_site_url$fn__29581.invoke(middleware.clj:268)"]