| TECHNOLOGY | QUESTION | SOLUTION |
|---|---|---|
Azure Synapse | The query references an object that is not supported in distributed
processing mode | Some objects, such as system views and functions, can't be used while you
query data stored in Azure Data Lake or Azure Cosmos DB analytical storage. Avoid queries that join external data with system views, load external data into a temp table, or use security or metadata functions to filter external data. |
Azure Synapse | Query returning NULL values instead of partitioning columns or can't find
the partition columns | Troubleshooting steps:
If you use tables to query a partitioned dataset, be aware that tables don't support partitioning. Replace the table with the partitioned views.
If you use the partitioned views with the OPENROWSET that queries partitioned files by using the FILEPATH() function, make sure you correctly specified... |
Azure Synapse | Missing column when using automatic schema inference | You can easily query files without knowing or specifying the schema by
omitting the WITH clause. In that case, column names and data types are inferred from the files. Keep in mind that if you are reading a number of files at once, the schema is inferred from the first file the service gets from storage. This can mean ... |
Azure Synapse | Failed to execute query. Error: CREATE EXTERNAL
TABLE/DATA SOURCE/DATABASE SCOPED CREDENTIAL/FILE FORMAT is not supported in master database. | 1. Create a user database:
CREATE DATABASE <DATABASE_NAME>
2. Execute a CREATE statement in the context of <DATABASE_NAME>, which failed earlier for the master database.
Here's an example of the creation of an external file format:
USE <DATABASE_NAME>
CREATE EXTERNAL FILE FORMAT [SynapseParquetFormat]
WITH ( FORMAT... |
Azure Synapse | Getting an error while trying to create a new Azure AD login or user
in a database | Check the login you used to connect to your database. The login that's trying to create a new Azure AD user must have permission to access the Azure AD domain and check whether the user exists. Be aware that:
SQL logins don't have this permission, so you'll always get this error if you use SQL authentication.
If you use an... |
Azure Synapse | Resolving Azure Cosmos DB path has failed with error 'This request is not
authorized to perform this operation'. | Check whether you used private endpoints in Azure Cosmos DB. To allow
serverless SQL pool to access an analytical store with private endpoints, you must configure private endpoints for the Azure Cosmos DB analytical store. |
Azure Synapse | Delta table created in Spark is not shown in serverless pool | If you created a Delta table in Spark, and it is not shown in the serverless SQL pool, check the following:
1. Wait some time (usually 30 seconds), because Spark tables are synchronized with a delay.
2. If the table didn't appear in the serverless SQL pool after some time, check the schema of the Spark Delta table. S... |
GCP Cloud Storage - Web App | Failed to fetch metadata from the registry, with
reason: generic::permission_denied | To resolve this issue, grant the Storage Admin role to the service account:
To see which account you used, run the gcloud auth list command.
To learn why assigning only the App Engine Deployer (roles/appengine.deployer) role might not be sufficient in some cases, see App Engine roles. |
GCP Cloud Storage - Web App | Error: The App Engine appspot and App Engine flexible environment
service accounts must have permissions on the image IMAGE_NAME | This error occurs for one of the following reasons:
1. The default App Engine service account does not have the Storage Object Viewer (roles/storage.objectViewer) role.
To resolve this issue, grant the Storage Object Viewer role to the service account.
2. Your project has a VPC Service Perimeter which limits acce... |
GCP Cloud Storage - Web App | Failed to create cloud build: Permission denied | This error occurs if you use the gcloud app deploy command from an account that does not have the Cloud Build Editor (roles/cloudbuild.builds.editor) role.
To resolve this issue, grant the Cloud Build Editor role to the service account that you are using to deploy your app.
To see which account you used, run the gclou... |
GCP Cloud Storage - Web App | Timed out waiting for the app infrastructure to become healthy | To resolve this issue, rule out the following potential causes:
1. Verify that you have granted the Editor (roles/editor) role to your default App Engine service account.
2. Verify that you have granted the following roles to the service account that you use to run your application (usually the default service account... |
GCP Cloud Storage - Web App | Invalid value error when deploying in a Shared VPC setup | To resolve the issue, remove the instance_tag field from app.yaml and
redeploy. |
GCP Cloud Run | Container failed to start. Failed to start and then listen on the port
defined by the PORT environment variable. | To resolve this issue, rule out the following potential causes:
Verify that you can run your container image locally. If your container image cannot run locally, you need to diagnose and fix the issue locally first.
Check if your container is listening for requests on the expected port as documented in the container ... |
GCP Cloud Run | The server has encountered an internal error. Please try again later.
Resource readiness deadline exceeded. | This issue might occur when the Cloud Run service agent does not exist, or when it does not have the Cloud Run Service Agent (roles/run.serviceAgent) role.
To verify that the Cloud Run service agent exists in your Google Cloud project and has the necessary role, perform the following steps:
Open the Google Cloud cons... |
GCP Cloud Run | Can I run Cloud Run applications on a private IP? | Currently, no. Cloud Run applications always have a *.run.app public hostname and they cannot be placed inside a VPC (Virtual Private Cloud) network.
If any other private service (e.g. GCE VMs, GKE) needs to call your Cloud Run application, they need to use this public hostname.
With ingress settings on Cloud Run, yo... |
GCP Cloud Run | The service has encountered an error during container import. Please try again later. Resource readiness deadline exceeded. | To resolve this issue, rule out the following potential causes:
1. Ensure the container's file system does not contain non-UTF-8 characters.
2. Some Windows based Docker images make use of foreign layers. Although Container Registry doesn't throw an error when foreign layers are present, Cloud Run's control plane does not... |
GCP Cloud Run | The request was not authorized to invoke this service | To resolve this issue:
1. If invoked by a service account, the audience claim (aud) of the Google-signed ID token must be set to the following:
i. The Cloud Run URL of the receiving service, using the form https://service-xyz.run.app.
The Cloud Run service must require authentication.
The Clou... |
GCP Cloud Run | The request was not authenticated. Either allow unauthenticated
invocations or set the proper Authorization header | To resolve this issue:
1. If the service is meant to be invocable by anyone, update its IAM settings to make the service public.
2. If the service is meant to be invocable only by certain identities, make sure that you invoke it with the proper authorization token.
i. If invoked by a developer or invoked by an end... |
GCP Cloud Run | HTTP 429
The request was aborted because there was no available instance.
The Cloud Run service might have reached its maximum container instance
limit or the service was otherwise not able to scale to incoming requests.
This might be caused by a sudden increase in traffic, a long container startup time or a long reque... | To resolve this issue, check the "Container instance count" metric for
your service and consider increasing this limit if your usage is nearing the maximum. See "max instance" settings, and if you need more instances, request a quota increase. |
GCP Cloud Run | This might be caused by a sudden increase in traffic, a long container startup time, or a long request processing time. | To resolve this issue, address the causes listed above.
In addition to fixing these issues, as a workaround you can implement exponential backoff and retries for requests that the client must not drop.
Note that a short and sudden increase in traffic or request processing time might only be visible in Cloud Moni... |
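The exponential-backoff-and-retry workaround can be sketched in Python. The helper below is only an illustration of the pattern, not part of any Cloud Run API; the function names and delay values are assumptions.

```python
import random
import time

def call_with_backoff(request_fn, max_attempts=5, base_delay=0.5, max_delay=8.0):
    """Retry request_fn with exponential backoff and jitter.

    request_fn should raise an exception on a retryable failure
    (for example, an HTTP 429 response from Cloud Run).
    """
    for attempt in range(max_attempts):
        try:
            return request_fn()
        except Exception:
            if attempt == max_attempts - 1:
                raise  # give up after the last attempt
            # Exponential delay: base, 2x, 4x, ... capped at max_delay,
            # plus random jitter to avoid synchronized client retries.
            delay = min(base_delay * (2 ** attempt), max_delay)
            time.sleep(delay + random.uniform(0, delay / 2))
```

Only apply this to requests the client must not drop; requests that are safe to shed should fail fast instead.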
GCP Cloud Run | HTTP 500 / HTTP 503: Container instances are exceeding memory limits | To resolve this issue:
1. Determine if your container instances are exceeding the available memory. Look for related errors in the varlog/system logs.
2. If the instances are exceeding the available memory, consider increasing the memory limit.
Note that in Cloud Run, files written to the local filesystem count toward... |
GCP Cloud Run | HTTP 503: Unable to process some requests due to high concurrency setting | To resolve this issue, try one or more of the following:
1. Increase the maximum number of container instances for your service.
2. Lower the service's concurrency. Refer to setting concurrency for more detailed instructions. |
GCP Cloud Run | HTTP 504
The request has been terminated because it has reached the maximum request timeout. | To troubleshoot this issue, try one or more of the following:
1. Instrument logging and tracing to understand where your app is spending time before exceeding your configured request timeout.
2. Outbound connections are reset occasionally, due to infrastructure updates. If your application reuses long-lived connectio... |
GCP Cloud Run | asyncpg.exceptions.ConnectionDoesNotExistError: connection was
closed in the middle of operation | To resolve this issue:
1. If you are trying to perform background work with CPU throttling, try using the "CPU is always allocated" CPU allocation setting.
2. Ensure that you stay within the outbound request timeouts. If your application keeps any connection idle beyond this threshold, the gateway ne... |
GCP Cloud Run | assertion failed: Expected hostname or IPv6 IP enclosed in [] but got
<IPv6 ADDRESS> | To resolve this issue, set ENV SPARK_LOCAL_IP="127.0.0.1" in your Dockerfile. In Cloud Run, if the variable SPARK_LOCAL_IP is not set, it defaults to its IPv6 counterpart instead of localhost. Note that setting RUN export SPARK_LOCAL_IP="127.0.0.1" wi... |
GCP Cloud Run | mount.nfs: access denied by server while mounting
IP_ADDRESS:/FILESHARE | If access was denied by the server, check to make sure the file share
name is correct. |
GCP Cloud Run | mount.nfs: Connection timed out | If the connection times out, make sure you are providing the correct
IP address of the filestore instance. |
GCP Cloud Run | How can I specify Google credentials in Cloud Run applications? | For applications running on Cloud Run, you don't need to deliver JSON keys for IAM service accounts or set the GOOGLE_APPLICATION_CREDENTIALS environment variable.
Just specify the service account (--service-account) you want your application to use when deploying the app. See Configuring service identity. |
GCP Cloud Run | How to do canary or blue/green deployments on Cloud Run? | If you updated your Cloud Run service, you probably realized it creates a new revision for every new configuration of your service.
Cloud Run allows you to split traffic between multiple revisions, so you can do gradual rollouts such as canary deployments or blue/green deployments. |
GCP Cloud Run | How to configure secrets for Cloud Run applications? | You can use Secret Manager with Cloud Run. Read how to write code and set permissions to access the secrets from your Cloud Run app in the documentation.
Alternatively, if you'd like to store secrets in Cloud Storage (GCS) using Cloud KMS envelope encryption, check out the Berglas tool and library (Berglas also has su... |
GCP Cloud Run | How to connect IPs in a VPC network from Cloud Run? | Cloud Run now has support for "Serverless VPC Access". This feature allows Cloud Run applications to connect to private IPs in the VPC (but not the other way around).
This way your Cloud Run applications can connect to private VPC IP addresses running:
GCE VMs
Cloud SQL instances
Cloud Memorystore instances
Kubernet... |
GCP Cloud Run | How can I serve responses larger than 32MB with Cloud Run? | Cloud Run can stream responses that are larger than 32MB using HTTP chunked encoding. Add the HTTP header Transfer-Encoding: chunked to your
response if you know it will be larger than 32MB. |
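As an aside, the chunked framing itself is simple: each chunk is prefixed with its size in hexadecimal, and the body ends with a zero-length chunk. On Cloud Run your web server or framework normally produces this framing for you, so the sketch below only illustrates what `Transfer-Encoding: chunked` puts on the wire:

```python
def encode_chunked(chunks):
    """Encode an iterable of byte strings using HTTP/1.1 chunked transfer coding."""
    out = bytearray()
    for chunk in chunks:
        if not chunk:
            continue  # an empty chunk would prematurely terminate the stream
        out += b"%x\r\n" % len(chunk)  # chunk size in hexadecimal
        out += chunk + b"\r\n"
    out += b"0\r\n\r\n"  # zero-length chunk marks the end of the body
    return bytes(out)
```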
GCP Security IAM | How can I use Multi Factor Authentication (MFA) with IAM? | When individual users use MFA, the methods they authenticate with
will be honored. This means that your own identity system needs to support MFA. For Google Workspace accounts, this needs to be enabled by the user themselves. For Google Workspace-managed credentials, MFA can be enabled with Google Workspace tools. |
GCP Security IAM | How do I control who can create a service account in my project? | Owner and editor roles have permissions to create service accounts in
a project. If you wish to grant a user the permission to create a service account, grant them the owner or the editor role. |
GCP Security IAM | How do I grant permissions to resources in my project to someone who
is not part of my organization? | Using Google groups, you can add a user outside of your organization to a group and bind that group to the role. Note that Google groups don't have login credentials, and you cannot use Google groups to establish identity to make a request to access a resource.
You can also directly add the user to the allow policy ev... |
GCP Security IAM | How can I manage who can access my instances? | To manage who has access to your instances, use Google groups to
grant roles to principals. Granting a role creates a role binding in an allow policy; you can grant the role on the project where the instances will be launched, or on individual instances. If a user (identified by their Google Account, for example, my-u... |
GCP Security IAM | How do I list the roles associated with a gcp service account? | To see roles per service account in the console:
1. Copy the email of your service account (from IAM & Admin -> Service Accounts - Details);
2. Go to: IAM & Admin -> Policy Analyzer -> Custom Query;
3. Set Parameter 1 to Principal. Paste the email into Principal field;
4. Click Continue, then click Run Query.
You'll g... |
GCP Security IAM | GCP Cloud Build fails with permissions error even though correct role is
granted | You need to add the cloudfunctions.developer and iam.serviceAccountUser roles to the [PROJECT_NUMBER]@cloudbuild.gserviceaccount.com account, and (I believe) that the aforementioned cloudbuild service account also needs to be added as a member of the service account that has permissions to deploy your Cloud Function (a... |
GCP Security IAM | How to set Google Cloud application credentials for a Service Account | gcloud auth application-default login uses the active (or specified) user account to create a local JSON file that behaves like a service account.
The alternative is to use gcloud auth activate-service-account but, as you know, you will need to have the service account's credentials as these will be used instead of the cre... |
GCP Security IAM | Is there a way to list all permissions from a user in GCP? | In Google Cloud Platform there is no single command that can do this. Permissions via roles are assigned to resources. Organizations, Folders, Projects, Databases, Storage Objects, KMS keys, etc can have IAM permissions assigned to them. You must scan (check IAM permissions for) every resource to determine the total se... |
GCP Security IAM | Can't delete a Google Cloud Project | 1. see your project retentions: gcloud alpha resource-manager liens list
2. if you have any retention delete: gcloud alpha resource-manager liens delete "name"
3. delete your project gcloud projects delete "project" |
GCP Security IAM | How to read from a Storage bucket from a GCE VM with no External IP? | You simply have to:
1. Go to Console -> VPC network
2. Choose the subnet of your VM instance (for example default -> us-central1)
3. Edit and select Private Google access -> On. Then save.
Also make sure that your VM has access to the Cloud Storage API. |
GCP Security IAM | I'm getting the error "cannot use role (type string) as type
"cloud.google.com/go/iam".RoleName in argument to policy.HasRole. | You can use type conversion as the following:
return policy.HasRole(serviceAccount, iam.RoleName(role))
Or, more simply, declare role as iam.RoleName:
func checkRole(key, serviceAccount string, role iam.RoleName) bool {
...
return policy.HasRole(serviceAccount, role)
} |
GCP Security IAM | Can I get a list of all resources for which a user has been added to a
role? | Roles are not assigned directly to users. This is why there is no single command that you can use.
IAM members (users, service accounts, groups, etc.) are added to resources with roles attached. A user can have permissions to a project and also have permissions at an individual resource (Compute Engine Instance A, Sto... |
GCP Security IAM | Is there a way to prevent deletion of a google spanner database even though developers have been granted broad (i.e. owner) access to the project? | A few approaches.
1. If you're worrying about a Spanner Database getting dropped, you can use the --enable-drop-protection flag when creating the DB, to ensure it cannot be accidentally deleted.
2. You can set up negative permissions through IAM Deny Policies in Google Cloud to expressly prevent someone, like a deve... |
GCP Security IAM | How to grant access to all service accounts in an organization? | You can use Google groups, which are collections of user and/or
service accounts. Add the service accounts to the Google group, then assign the necessary IAM roles to the group. |
GCP Security IAM | How to restrict BigQuery's dataset access for everyone having (Project
level Viewer) role | The solution here is to have Terraform (or something else) manage the resources for you.
You can develop a module that creates the appropriate things for a user e.g. a dataset, a bucket, some perms, a service account etc.
That way all you need to do is add another user to your list and re-deploy. The other additional... |
GCP Security IAM | How do I create a custom role for inserting into a specific BigQuery dataset? | You can drop the bigquery.datasets.get permission from the custom
IAM role so that they can't list all the datasets, and then in the dataset's permissions grant the user the READER role instead of WRITER for that specific dataset. |
GCP Security IAM | Service account does not have permission to access Firestore | Creating a service account by itself grants no permissions. The Permissions tab in IAM & Admin > Service Accounts shows a list of "Principals with access to this account" - this is not the inheritance of permissions, it's simply which accounts, aka principals, can make use of the permissions granted to this service acc... |
GCP Security IAM | How to connect to Cloud SQL from Azure Data Studio using an IAM user | We can connect using IAM database authentication via the Cloud SQL Auth proxy. The only step to be done from the GUI DB tool (mine is Azure Data Studio) is to connect to the IP the Cloud SQL Auth proxy listens on (127.0.0.1 is the default, and was the address in my case) after starting the Cloud SQL Auth proxy usin... |
GCP Security IAM | What is the correct GCP user role that I should assign to my external website developer? | You should grant the minimum role needed to do the work. If your developer only needs access to the Translation API, you can grant their account the Cloud Translation API Editor role.
If you want them to have full access to Cloud Translation resources, you can grant them the Cloud Translation API Admin role.
In c... |
GCP Security IAM | How to restrict access to triggering an HTTP Cloud Function via the trigger URL? | The problem is your access method. You are using your own user account (which has the Cloud Functions Invoker role) but with your browser, and browser requests don't carry an authentication header.
If you want to call your cloud function now, you have to add an authorization header with an identity token as bea... |
GCP Security IAM | What roles do my Cloud Build service account need to deploy an http
triggered unauthenticated Cloud Function? | The solution is to replace the Cloud Functions Developer role with the Cloud Functions Admin role.
Use of the --allow-unauthenticated flag modifies IAM permissions. To ensure that unauthorized developers cannot modify function permissions, the user or service that is deploying the function must have the cloudfunctions.functions.... |
GCP Big Query | Getting error as billingNotEnabled | Enable billing for the project in the Google Cloud console. |
GCP Big Query | How to create temporary table in Google BigQuery | To create a temporary table, use the TEMP or TEMPORARY keyword with the CREATE TABLE statement. CREATE TEMPORARY TABLE can only be used inside a script, so wrap it in a BEGIN…END block:
BEGIN
CREATE TEMP TABLE <table_name> AS SELECT * FROM <source_table> WHERE <condition>;
END; |
GCP Big Query | How to download all data in a Google BigQuery dataset? | Detailed step-by-step to download large query output
1. enable billing
You have to give your credit card number to Google to export the output, and you might have to pay.
But the free quota (1TB of processed data) should suffice for many hobby projects.
2. create a project
3. associate billing to a project
4... |
GCP Big Query | How to generate date series to occupy absent dates in Google
BigQuery? | Generating a list of dates and then joining whatever table you need on top seems easiest. I used generate_date_array + unnest and it looks quite clean.
To generate a list of days (one day per row):
SELECT
*
FROM
UNNEST(GENERATE_DATE_ARRAY('2018-10-01', '2020-09-30', INTERVAL 1 DAY)) AS example |
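A local Python equivalent of the query above can be handy for sanity-checking the expected output. The helper name mirrors the BigQuery function, but the code itself is only an illustration, not part of any Google library:

```python
from datetime import date, timedelta

def generate_date_array(start, end, step_days=1):
    """Yield dates from start to end inclusive, one per step,
    mimicking BigQuery's GENERATE_DATE_ARRAY."""
    current = start
    while current <= end:
        yield current
        current += timedelta(days=step_days)

# One row per day over a short illustrative range:
days = list(generate_date_array(date(2018, 10, 1), date(2018, 10, 3)))
```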
GCP Big Query | How many Google Analytics views can I export to BigQuery? | You can only export one view per Google Analytics property.
When selecting which view to export, it is important to consider which views have been customized with various changes to the View Settings (traffic
filters, content groupings, channel settings, etc.), or which views have the most historical data.
The view ... |
GCP Big Query | How to choose the latest partition in BigQuery table? | You can use a WITH statement, select the last few partitions, and filter the result. This is a better approach because:
You are not limited by fixed partition date (like today - 1 day). It will always take the latest partition from given range.
It will only scan last few partitions and not whole table.
Example with last 3 ... |
GCP Big Query | How can I change the project in BigQuery | You have two ways to do it:
1. Specify --project_id global flag in bq. Example: bq ls -j --project_id <PROJECT>
2. Change default project by issuing gcloud config set project <PROJECT> |
GCP Big Query | How to catch a failed CAST statement in BigQuery SQL? | You can use the SAFE_CAST function, which returns NULL if the input
is not a valid value when interpreted as the desired type. In your case, you would just use SAFE_CAST(UPDT_DT_TM AS DATETIME). It is in the Functions & Operators documentation. |
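If you post-process query results in client code, the same return-NULL-on-failure contract is easy to mirror. This Python helper is an illustration, not part of the BigQuery client library, and the format string is an assumption about the input:

```python
from datetime import datetime

def safe_cast_datetime(value, fmt="%Y-%m-%d %H:%M:%S"):
    """Return a datetime parsed from value, or None if the value is invalid --
    the same contract as BigQuery's SAFE_CAST(... AS DATETIME)."""
    try:
        return datetime.strptime(value, fmt)
    except (TypeError, ValueError):
        # TypeError covers None / non-string input; ValueError covers bad formats.
        return None
```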
GCP Big Query | JSON formatting Error when loading into Google Big Query | Yes, BigQuery only accepts new-line delimited JSON, which means
one complete JSON object per line. Before you merge the object to one line, BigQuery reads "{", which is start of an object, and expects to read a key, but the line ended, so you see the error message "expected key".
For multiple JSON objects, just put t... |
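Reshaping an ordinary JSON array into the newline-delimited form BigQuery expects takes only a few lines of Python. This is a sketch; the record contents are hypothetical:

```python
import json

def to_ndjson(records):
    """Serialize a list of objects as newline-delimited JSON:
    one complete JSON object per line, as BigQuery load jobs require."""
    return "\n".join(json.dumps(rec) for rec in records)
```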
GCP Big Query | I am trying to run the query "select * from tablename", but it throws
an error: "Error: Response too large to return". | Set allowLargeResults to true in your job configuration. You must also specify a destination table with the allowLargeResults flag.
If querying via API,
"configuration":
{
"query":
{
"allowLargeResults": true,
"query": "select uid from [project:dataset.table]"
"destinationTable": [projec... |
GCP Big Query | How can I refresh datasets/resources in the new Google BigQuery Web
UI? | If you click the search box in the project/dataset "Explorer" sidebar,
then press enter, it will refresh the list. |
GCP Big Query | Failed to save view. Bad table reference "myDataset.myTable"; table
references in standard SQL views require explicit project IDs | Your view has reference to myDataset.myTable - which is ok when you just run it as a query (for example in Web UI).
But to save it as a view you must fully qualify that reference as below
myProject.myDataset.myTable
So, just add project to that reference |
GCP Big Query | Bigquery Error: UPDATE/MERGE must match at most one source row for
each target row | This occurs because the target table contains rows that are duplicated with respect to the join condition. If a row in the table to be updated joins with more than one row from the FROM clause, BigQuery returns this error:
Solution
1. Remove the duplicated rows from the target table and perform the UPDATE/MERGE operatio... |
GCP Big Query | Create a BigQuery table from pandas dataframe, WITHOUT specifying
schema explicitly | Here's a code snippet to load a DataFrame to BQ:
import pandas as pd
from google.cloud import bigquery
# Example data
df = pd.DataFrame({'a': [1,2,4], 'b': ['123', '456', '000']})
# Load client
client = bigquery.Client(project='your-project-id')
# Define table name, in format dataset.table_name
table = 'your-datase... |
GCP Big Query | Table name missing dataset while no default dataset is set in the
request | Depending on which API you are using, you can specify the defaultDataset parameter when running your BigQuery job. More information for the jobs.query api can be found here https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/query.
For example, using the NodeJS API for createQueryJob https://googleapis.dev/n... |
GCP Big Query | Is there an easy way to convert rows in BigQuery to JSON? | If you want to glue together all of the rows quickly into a JSON block, you can do something like:
SELECT CONCAT("[", STRING_AGG(TO_JSON_STRING(t), ","), "]")
FROM `project.dataset.table` t
This will produce a table with 1 row that contains a complete JSON blob summarizing the entire table. |
GCP Big Query | How do I list tables in Google BigQuery that match a certain name? | You can do something like below in BigQuery Legacy SQL
SELECT *
FROM publicdata:samples.__TABLES__
WHERE table_id CONTAINS 'github'
Or with BigQuery Standard SQL
SELECT *
FROM publicdata.samples.__TABLES__
WHERE starts_with(table_id, 'github') |
GCP Big Query | BigQuery fails to save view that uses functions | BigQuery now supports permanent registration of UDFs. To use your UDF in a view, you'll need to create it first.
CREATE OR REPLACE FUNCTION `ACCOUNT-NAME11111.test.STR_TO_TIMESTAMP`
(str STRING)
RETURNS TIMESTAMP AS (PARSE_TIMESTAMP('%Y-%m-%dT%H:%M:%E*SZ', str));
i. Note that you must use a ... |
GCP Big Query | Query Failed Error: Resources exceeded during query execution: The
query could not be executed in the allotted memory | The only way for this query to work is by removing the ordering applied in the end:
SELECT
fullVisitorId,
CONCAT(CAST(fullVisitorId AS string),CAST(visitId AS string)) AS session,
date,
visitStartTime,
hits.time,
hits.page.pagepath
FROM
`XXXXXXXXXX.ga_sessions_*`,
UNNEST(hits) AS hits
WHERE
_TABLE_S... |
GCP Big Query | How to convert results returned from bigquery to Json format using
Python? | There is no current method for automatic conversion, but there is a pretty simple manual method to convert to JSON:
records = [dict(row) for row in query_job]
json_obj = json.dumps(records, default=str)
Another option is to convert using pandas:
df = query_job.to_dataframe()
json_obj = df.to_json(orient='records') |
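The dict-based conversion can be checked locally with plain dicts standing in for BigQuery Row objects (the data below is made up; no BigQuery client is needed):

```python
import json

# Stand-ins for BigQuery Row objects: each behaves like a mapping.
rows = [{"name": "alice", "n": 1}, {"name": "bob", "n": 2}]

records = [dict(row) for row in rows]
json_obj = json.dumps(records)  # plain dicts, so no default=str fallback needed
```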
GCP VM | Getting an error when connecting to VM using the SSH-in-browser from the Google Cloud console | To resolve this issue, have a Google Workspace admin do the following:
1. Confirm that Google Cloud is enabled for the organization.
If Google Cloud is disabled, enable it and retry the connection.
2. Confirm that services that aren't controlled individually are enabled.
If these services are disabled, enable them ... |
GCP VM | The following error is occurring when I start an SSH session:
Could not connect, retrying … | To resolve this issue, do the following:
1. After the VM has finished booting, retry the connection. If the connection is not successful, verify that the VM did not boot in emergency mode by running the following command:
gcloud compute instances get-serial-port-output VM_NAME \
| grep "emergency mode"
If the VM boots... |
GCP VM | The SSH connection failed after upgrading the VM's kernel. | To resolve this issue, do the following:
1. Mount the disk to another VM.
2. Update the grub.cfg file to use the previous version of the kernel.
3. Attach the disk to the unresponsive VM.
4. Verify that the status of the VM is RUNNING by using the gcloud compute instances describe command.
5. Reinstall the kernel.
... |
GCP VM | Connection via Cloud Identity-Aware Proxy Failed | To resolve this issue Create a firewall rule on port 22 that allows ingress
traffic from Identity-Aware Proxy. |
GCP VM | ERROR: (gcloud.compute.ssh) Could not SSH into the instance.
It is possible that your SSH key has not propagated to the instance yet.
Try running this command again. If you still cannot connect, verify that the firewall and instance are set to accept ssh traffic. | This error can occur for several reasons. The following are some of the most common causes of the errors:
1. You tried to connect to a Windows VM that doesn't have SSH installed.
To resolve this issue, follow the instructions to Enable SSH for Windows on a running VM.
2. The OpenSSH Server (sshd) isn't running or is... |
GCP VM | ERROR: (gcloud.compute.ssh) FAILED_PRECONDITION: The specified
username or UID is not unique within given system ID. | This error occurs when OS Login attempts to generate a username that already exists within an organization. This is common when a user account is deleted and a new user with the same email address is created shortly after. After a user account is deleted, it takes up to 48 hours to remove the user's POSIX information.
... |
GCP VM | Error message:
"code": "RESOURCE_OPERATION_RATE_EXCEEDED",
"message": "Operation rate exceeded for resource 'projects/project-id/zones/zone-id/disks/disk-name'. Too frequent operations from the source resource." | Resolution:
To create multiple disks from a snapshot, use the snapshot to create an image then create your disks from the image:
Create an image from the snapshot.
Create persistent disks from the image. In the Google Cloud console, select Image as the disk Source type. With the gcloud CLI, use the image flag. In the... |
GCP VM | Error message:
The resource 'projects/PROJECT_NAME/zones/ZONE/RESOURCE_TYPE/RESOURCE_NAME' already exists | Resolution: Retry your creation request with a unique resource name. |
GCP VM | Error message:
Could not fetch resource:
- The selected machine type (MACHINE_TYPE) has a required CPU platform of REQUIRED_CPU_PLATFORM.
The minimum CPU platform must match this, but was SPECIFIED_CPU_PLATFORM. | Resolution:
1. To learn about which CPU platform your machine type supports, review CPU platforms.
2. Retry your request with a supported CPU platform. |
GCP VM | Error Message:
Invalid value for field 'resource.sourceMachineImage': Updating 'sourceMachineImage' is not supported | Resolution:
1. Make sure that your VM supports the processor of the new machine type. For more information about the processors supported by different machine types, see Machine family comparison.
2. Try to change the machine type by using the Google Cloud CLI. |
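Changing the machine type with the gcloud CLI can be sketched as follows (names are placeholders; the VM must be stopped before its machine type can change):

```
# Stop the VM first; the machine type can only change while it is TERMINATED
gcloud compute instances stop my-vm --zone=us-central1-a

gcloud compute instances set-machine-type my-vm \
    --machine-type=e2-standard-8 \
    --zone=us-central1-a

gcloud compute instances start my-vm --zone=us-central1-a
```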
GCP VM | ERROR: Registration failed: Registering system to registration proxy https://smt-gce.susecloud.net
command '/usr/bin/zypper --non-interactive refs Python_3_Module_x86_64' failed
Error: zypper returned 4 with 'Problem retrieving the repository index file for service 'Python_3_Module_x86_64':
Timeout exceeded when acces... | To resolve this issue, review the Cloud NAT configuration to verify
that the minimum ports per VM instance parameter is set to at least 160. |
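With the gcloud CLI, the Cloud NAT setting can be inspected and raised roughly like this (the NAT, router, and region names are placeholders):

```
# Inspect the current NAT configuration
gcloud compute routers nats describe my-nat \
    --router=my-router --region=us-central1

# Raise the minimum ports per VM to at least 160
gcloud compute routers nats update my-nat \
    --router=my-router --region=us-central1 \
    --min-ports-per-vm=160
```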
GCP VM | ERROR: (gcloud.compute.instances.set-machine-type) Could not fetch
resource:
Invalid resource usage: 'Requested boot disk architecture (X86_64) is not compatible with machine type architecture (ARM64).' | Resolution:
Make sure that your VM supports the processor of the new machine type. For more information about the processors supported by different machine types, see Machine family comparison.
Try to change the machine type by using the Google Cloud CLI.
If you switch from an x86 machine type to an Arm T2A machine ... |
GCP VM | Notification: "Machine type architecture (ARM64) is not compatible with requested boot disk architecture (X86_64)." | To resolve this issue, try one of the following:
1. If you are using a zonal MIG, use a regional MIG instead.
2. Create multiple MIGs and split your workload across them—for example by adjusting your load balancing configuration.
3. If you still need a bigger group, contact support to make a request. |
GCP VM | Can't move a VM to a sole-tenant node. | Solution:
1. A VM instance with a specified minimum CPU platform can't be moved to a sole-tenant node by updating VM tenancy. To move a VM to a sole-tenant node, remove the minimum CPU platform specification by setting it to automatic.
2. Because each sole-tenant node uses a specific CPU platform, all VMs running on ... |
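Clearing the minimum CPU platform can be sketched with gcloud (a sketch assuming the `--min-cpu-platform` flag on `gcloud compute instances update`, which accepts AUTOMATIC to clear the setting; names are placeholders):

```
# Stop the VM, then clear its minimum CPU platform so it can move
# to a sole-tenant node
gcloud compute instances stop my-vm --zone=us-central1-a

gcloud compute instances update my-vm \
    --zone=us-central1-a \
    --min-cpu-platform=AUTOMATIC
```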
GCP VM | Error message: No feasible nodes found for the instance given its node affinities and other constraints. | Specify values for the minimum number of CPUs for each VM so that
the total for all VMs does not exceed the number of CPUs specified by the sole-tenant node type. |
GCP Fire Store | ABORTED ERROR:
Too much contention on these datastore entities. Please try again. | To resolve this issue:
1. For rapid traffic increases, Firestore attempts to scale automatically to meet the increased demand; as it scales, latency decreases.
2. Hot-spots limit Firestore's ability to scale up. Review designing for scale to identify hot-spots.
3. Review data contention in trans...
GCP Fire Store | RESOURCE_EXHAUSTED Error:
Some resource has been exhausted, perhaps a per-user quota, or perhaps the entire file system is out of space. | To resolve this issue:
Wait for the daily reset of your free tier quota or enable billing for your project. |
GCP Fire Store | INVALID_ARGUMENT: The value of property field_name is longer than
1048487 bytes | To resolve this issue:
1. For indexed field values, split the field into multiple fields. If possible, create an un-indexed field and move data that doesn't need to be indexed into the un-indexed field.
2. For un-indexed field values, split the field into multiple fields or implement compression for the field value. |
GCP Fire Store | Firestore : “Error: 9 FAILED_PRECONDITION: The Cloud Firestore API is
not available for Cloud Datastore projects” [duplicate] | Three possible causes and fixes:
1. Firestore is not set as your Datastore.
Go to https://console.cloud.google.com/firestore/. You'll see a popup prompting you to initialize Firestore as the native datastore; complete that initialization.
2. You are logged into the wrong account in the gcloud SDK.
If you're on localhost, in your termi...
GCP Fire Store | I am trying to create a Vue Composable that uploads a file to Firebase Storage.
To do this I am using the modular Firebase 9 version.
But my current code does not upload anything, and instead returns this error: FirebaseError: Firebase Storage: An unknown error occurred, please check the error payload for server respon... | To fix that take these steps:
1. Go to https://console.cloud.google.com
2. Select your project in the top blue bar (you will probably need to switch to the "all" tab to see your Firebase projects)
3. Scroll down the left menu and select "Cloud Storage"
4. Select all your buckets then click "Show INFO panel" in the top... |
GCP Fire Store | How can I fix a Firebase/Firestore error in React Native? | The issue was fixed by downgrading Firebase to version 6.0.2 and cleaning the project's cache.
Cleaning instructions:
In the /android folder, run ./gradlew clean.
You can also use the https://www.npmjs.com/package/react-native-clean-project package.
GCP Fire Store | Firestore error : Stream closed with status : PERMISSION_DENIED | Replace your rules with this and try:
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{multiSegment=**} {
      allow read, write;
    }
  }
}
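Note that rules granting unconditional read and write let anyone on the internet access the entire database; they are useful for confirming that PERMISSION_DENIED came from security rules, but should not ship to production. An illustrative, slightly safer variant that at least requires sign-in:

```
rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{multiSegment=**} {
      // Illustrative: only authenticated users; tighten per collection
      allow read, write: if request.auth != null;
    }
  }
}
```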
GCP Fire Store | How can I fix my firestore database setup error? | Most likely snapshot.docChanges() is an empty array, so
snapshot.docChanges()[0].doc.data() then fails. You'll want to check for an empty result set before accessing a member by its index like that. |
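A minimal guard for that pattern might look like this (the `firstChangedDoc` helper and the mock snapshot are illustrative, not from the original answer):

```javascript
// Illustrative helper: read the first changed document only when one exists.
function firstChangedDoc(snapshot) {
  const changes = snapshot.docChanges();
  if (changes.length === 0) {
    return null; // empty result set: nothing to index into
  }
  return changes[0].doc.data();
}

// Behaves the same against a mock snapshot as against a real Firestore one:
const emptySnap = { docChanges: () => [] };
console.log(firstChangedDoc(emptySnap)); // → null
```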
GCP Fire Store | How do I fix my Flutter app not building with Cloud Firestore? | I had the same issue and noticed that my firebase_core dependency in pubspec.yaml was outdated.
Use firebase_core: ^1.20.0 and it works.
Do not forget to run flutter clean.
GCP Fire Store | How do I fix "Could not reach Cloud Firestore Backend" error? | If you are using Android Studio, Go to
AVD Manager
Your virtual devices
Drop down by the right-hand side of the device
Wipe Data
Cold Boot
This should fix your issue |
GCP Fire Store | How to solve FirebaseError: Expected first argument to collection() to
be a CollectionReference, a DocumentReference or FirebaseFirestore problem? | You need to use in your imports either:
'firebase/firestore'
OR
'firebase/firestore/lite'
Not both in the same project.
In your case, the firebase.ts file is using:
import { getFirestore } from 'firebase/firestore/lite'
And in your hook:
import { doc, onSnapshot, Unsubscribe } from 'firebase/firestore'
So you're i... |
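A consistent pair of imports might look like this (illustrative; note that the lite build omits realtime listeners such as onSnapshot, so the full build is what this hook needs anyway):

```
// firebase.ts — initialize from the SAME build used everywhere else
import { getFirestore } from 'firebase/firestore';

// hook file — same build, not 'firebase/firestore/lite'
import { doc, onSnapshot, Unsubscribe } from 'firebase/firestore';
```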
GCP Fire Store | I am getting an error while uploading date data to Firestore in Flutter | Firestore stores dates in ISO 8601 format. Say your birthday is 08-11-2004; the code would be:
final date = DateTime(2004, 11, 8).toIso8601String();
Now you can upload the date variable to Firestore as a Date value.