I'm confronting a strange situation here with Google Cloud SQL.
I'm migrating a 15.7 GB MySQL database to Google Cloud, and I've followed the migration process exactly as the documentation describes. Everything worked perfectly: there were no issues during the process, and my application works just fine. The only problem is that the size used by the DB, as shown on Google Cloud, is much bigger than the original DB. Right now I have a 39 GB SQL database that started out as 15.7 GB.
After some research and testing, I've come to the conclusion that it's the way Google counts the data on their side.
I just wanted to know if somebody has any idea, or can confirm what I'm saying.
Thank you for your answers.
I just got into SQL to do some data science and was wondering why my code was running but not affecting the MySQL database in any way. I am using PyCharm and the MySQLdb module.
import MySQLdb

db = MySQLdb.connect(host="localhost",
                     user="root",
                     passwd="********",  # Password blocked
                     db="test")
cur = db.cursor()
cur.execute("SELECT * FROM movies")
cur.execute("Update movies set genre = 'action' where id = 1")
for row in cur.fetchall():
    print row[0], " ", row[1], " ", row[2]
My code runs and returns no errors, but when I delete the
cur.execute("Update movies set genre = 'action' where id = 1")
line, it just prints out the table as it was before. Just for reference, here is the table:
1 Interstellar sci-fi
2 Thor: Ragnarok action
3 Thor: The Dark World action
How can I make the commands in Python actually affect the table? Thank you so much for your help!
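The behavior described above is consistent with the connection never committing: the Python DB-API (which MySQLdb implements) opens a transaction implicitly, and an uncommitted UPDATE is discarded when the connection closes. A minimal sketch of the pitfall, using the standard-library sqlite3 module as a stand-in for MySQLdb so it runs anywhere (the table name and values here are made up for illustration):

```python
import sqlite3
import tempfile
import os

# Use a temporary file so independent connections can see the same database.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

# First connection: insert a row (committed), then UPDATE it *without* committing.
db = sqlite3.connect(path)
cur = db.cursor()
cur.execute("CREATE TABLE movies (id INTEGER, title TEXT, genre TEXT)")
cur.execute("INSERT INTO movies VALUES (1, 'Interstellar', 'sci-fi')")
db.commit()  # the INSERT is committed...
cur.execute("UPDATE movies SET genre = 'action' WHERE id = 1")
db.close()   # ...but the uncommitted UPDATE is discarded here

# Second connection: the update never happened.
db2 = sqlite3.connect(path)
genre = db2.execute("SELECT genre FROM movies WHERE id = 1").fetchone()[0]
print(genre)  # still 'sci-fi'
db2.close()

# With an explicit commit, the change sticks.
db3 = sqlite3.connect(path)
db3.execute("UPDATE movies SET genre = 'action' WHERE id = 1")
db3.commit()
db3.close()
```

Applied to the MySQLdb code in the question, adding db.commit() after the UPDATE (or enabling autocommit on the connection right after connecting) should make the change persist.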
I want to start a data warehouse in Google BigQuery, but I'm not sure how to actually schedule jobs to get the data into the cloud.
To give some background: I have a MySQL database hosted on-prem, and I currently take a dump of it each night as a backup. My idea is that I can send this dump to Google Cloud and have it import the data into BigQuery. I thought I could send the dump and probably use Cloud Scheduler to run something that opens the dump and does the import, but I'm unsure how these services all fit together.
I'm a bit of a newbie with Google Cloud, so if there is a better way to achieve this, I'm happy to change my plan of action.
Thanks in advance.
I am using hadoop-1.2.1, and my Sqoop version is 1.4.4.
I am trying to run the following command:
sqoop import --connect jdbc:mysql://IP:3306/database_name --table clients --target-dir /data/clients --username root --password-file /sqoop.password -m 1
sqoop.password is a file kept on HDFS at /sqoop.password with permission 400.
It is giving me this error:
Access denied for user 'root'@'IP' (using password: YES)
Can anyone provide a solution for this?
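One common cause of this exact error with --password-file is a trailing newline in the file: Sqoop reads the file contents verbatim, so a password saved with a bare echo (which appends '\n') no longer matches what MySQL expects. A sketch of creating the file safely (the password below is a placeholder):

```shell
# Write the password with no trailing newline (-n); a stray '\n' would be
# sent to MySQL as part of the password and trigger "Access denied".
echo -n 'my_db_password' > sqoop.password

# Sanity check: the byte count must equal the password length (14 here),
# proving no newline was appended.
wc -c < sqoop.password

# Then place the file on HDFS and restrict access (run on the cluster):
#   hadoop fs -put sqoop.password /sqoop.password
#   hadoop fs -chmod 400 /sqoop.password
```

If the file checks out, the other usual suspect is the MySQL grant itself: 'root'@'IP' must actually be allowed to connect from the Hadoop node, which you can verify by running mysql -h IP -u root -p from that machine.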
Does anyone know a good (preferably open-source and cross-platform) tool for simple visualization of MySQL databases? I just need a tool I can quickly point at a database that will show basic table structure, field types, etc. Nothing too advanced or crazy.