Trying to install xgboost is failing. The version is Anaconda 2.1.0 (64-bit) on Windows, enterprise edition. How do I proceed? I have been using R, and it seems quite easy to install a new package in R from RStudio, but not so in Spyder, as I need to go to a command window to do it, and in this case it fails.
import sys
print (sys.version)
2.7.8 |Anaconda 2.1.0 (64-bit)| (default, Jul 2 2014, 15:12:11)

C:\anaconda\Lib\site-packages>pip install -U xgboost
Downloading/unpacking xgboost
Could not find a version that satisfies the requirement xgboost (from versions: 0.4a12, 0.4a13)
Cleaning up...
No distributions matching the version for xgboost
Storing debug log for failure in C:\Users\c_kazum\pip\pip.log
------------------------------------------------------------
C:\Users\c_kazum\AppData\Local\Continuum\Anaconda\Scripts\pip-script.py run on 08/27/15 12:52:30
Downloading/unpacking xgboost
Getting page https://pypi.python.org/simple/xgboost/
URLs to search for versions for xgboost:
*...
I'm trying to train a word embedding classifier using TF2.4 with Keras and tf.nn.sampled_softmax_loss. However, when calling the model's fit method, a "Cannot convert a symbolic Keras input/output to a numpy array" TypeError occurs. Please help me fix the error, or suggest an alternative approach to do candidate sampling.
import tensorflow as tf
import numpy as np
I'm taking this course on Coursera, and I'm running into some issues while doing the first assignment. The task is basically to use regular expressions to get certain values from the given file. The function should then output a dictionary containing these values:
example_dict = {"host":"146.204.224.152",
"user_name":"feest6811",
"time":"21/Jun/2019:15:45:24 -0700",
"request":"POST /incentivize HTTP/1.1"}
This is just a screenshot of the file; for some reason, the link doesn't work unless it's opened directly from Coursera. I apologize in advance for the bad formatting. One thing I must point out is that for some cases, as you can see in the first example, there's no username. Instead, '-' is used.
159.253.153.40 - - "POST /e-business HTTP/1.0" 504 19845
136.195.158.6 - feeney9464 "HEAD /open-source/markets HTTP/2.0" 204 21149
This is what I currently have. However, the output is None. I guess there's something wrong in my...
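A regex along these lines parses such lines with named groups (a sketch: the sample line below is my own, assembled from the example_dict above, and the exact pattern must match the course file's format, including the optional '-' username and the bracketed timestamp):

```python
import re

# Named groups mirror the keys the assignment expects; the [time] part is
# optional since the screenshot lines above don't show it.
pattern = (
    r'(?P<host>\S+)\s+-\s+(?P<user_name>\S+)\s+'
    r'(?:\[(?P<time>[^\]]+)\]\s+)?"(?P<request>[^"]+)"'
)

def parse_line(line):
    m = re.match(pattern, line)
    return m.groupdict() if m else None

line = '146.204.224.152 - feest6811 [21/Jun/2019:15:45:24 -0700] "POST /incentivize HTTP/1.1" 200 1234'
print(parse_line(line))
```

Returning None on no match makes it easy to spot which lines the pattern fails on, which is usually the cause of an overall None output.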
I want to set up a Hadoop cluster in pseudo-distributed mode. I managed to perform all the setup steps, including starting up a Namenode, Datanode, Jobtracker and Tasktracker on my machine.
Then I tried to run some example programs and faced the java.net.ConnectException: Connection refused error. I stepped back to the very first steps of running some operations in standalone mode and faced the same problem.
I even triple-checked all the installation steps and have no idea how to fix it. (I am new to Hadoop and a beginner Ubuntu user, so I kindly ask you to take that into account when providing any guide or tip.)
This is the error output I keep receiving:
hduser@marta-komputer:/usr/local/hadoop$ bin/hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-2.6.0.jar grep input output 'dfs+'
15/02/22 18:23:04 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
15/02/22 18:23:04 INFO client.RMProxy: ...
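"Connection refused" usually means no daemon is listening on the address the configuration names. A quick sanity check is to probe the port directly (a sketch; the host and port are assumptions, since fs.defaultFS commonly points at hdfs://localhost:9000 but depends on your core-site.xml):

```python
import socket

def port_open(host, port, timeout=2.0):
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Probe the NameNode RPC port named in core-site.xml (9000 is a common default).
print(port_open("localhost", 9000))
```

If this prints False while the daemons appear started, the usual suspects are a hostname mismatch in /etc/hosts, the daemons binding to a different interface, or HDFS not actually running.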
I have a problem with calculating CPC in Tableau. I have the cost and the number of clicks, but Tableau is not calculating the right CPC; the formula I used is shown in the attachments. I attached two tables to this request: the first shows the table where I calculated all KPIs in Zeppelin, the second shows the calculation in Tableau.
The whole data set has many null and 0 values, but it is the same data set used in Zeppelin.
May I ask for help on how to solve this issue? The CPC result is not correct in Tableau.
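A common cause of this mismatch is averaging row-level CPCs instead of dividing the totals. In Tableau the row-safe form is a single calculated field SUM([Cost]) / SUM([Clicks]), which also sidesteps the null and zero-click rows. The numbers below are illustrative, not the poster's data:

```python
# Averaging per-row CPCs gives a different answer than the aggregate CPC.
costs  = [100.0, 10.0, 0.0]
clicks = [200,   10,   0]

rows = [(c, k) for c, k in zip(costs, clicks) if k]        # drop zero-click rows
avg_of_row_cpcs = sum(c / k for c, k in rows) / len(rows)  # mean of ratios
aggregate_cpc = sum(c for c, _ in rows) / sum(k for _, k in rows)  # ratio of sums

print(avg_of_row_cpcs)            # 0.75
print(round(aggregate_cpc, 4))    # 0.5238
```

If Zeppelin computed SUM(cost)/SUM(clicks) over the whole set while Tableau aggregates a per-row CPC field, the two results will disagree exactly like this.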
I'm relatively new to GCP and just starting to set up/evaluate my organization's architecture on GCP.
Scenario: Data will flow into a Pub/Sub topic (high frequency, low amount of data). The goal is to move that data into Bigtable. From my understanding, you can do that either with a Cloud Function triggering on the topic or with Dataflow.
Now I have previous experience with cloud functions which I am satisfied with, so that would be my pick.
I fail to see the benefit of choosing one over the other. So my question is: when should I choose which of these products?
Thanks
I am new to Hadoop, and want to know the differences between hadoop-common, hadoop-core and hadoop-client.
By the way, for a given class, how do I know which Maven artifact contains it? For example, which one contains org.apache.hadoop.io.Text?
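One practical way to answer "which artifact contains this class" is to remember that a jar is just a zip archive, so you can scan your local jars for the class entry (a sketch; the helper name is mine, and the demo uses a fake jar since, as far as I know, org.apache.hadoop.io.Text ships in hadoop-common for Hadoop 2.x, which you should verify, e.g. on search.maven.org):

```python
import pathlib, tempfile, zipfile

def find_class(jar_dir, class_name):
    """Return names of .jar files under jar_dir containing class_name."""
    entry = class_name.replace(".", "/") + ".class"
    hits = []
    for jar in sorted(pathlib.Path(jar_dir).rglob("*.jar")):
        with zipfile.ZipFile(jar) as zf:
            if entry in zf.namelist():
                hits.append(jar.name)
    return hits

# Demo with a fake jar standing in for the real artifact:
tmp = tempfile.mkdtemp()
with zipfile.ZipFile(pathlib.Path(tmp) / "hadoop-common.jar", "w") as zf:
    zf.writestr("org/apache/hadoop/io/Text.class", b"")
print(find_class(tmp, "org.apache.hadoop.io.Text"))
```

Pointed at your local Maven repository (~/.m2/repository), this tells you exactly which downloaded artifact provides a class.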
I am not a database expert and have no formal computer science background, so bear with me. I want to know the kinds of real-world negative things that can happen if you use an old MongoDB version prior to v4, which was not ACID-compliant. This applies to any ACID-noncompliant database.
I understand that MongoDB can perform atomic operations, but that it doesn't "support traditional locking and complex transactions", mostly for performance reasons. I also understand the importance of database transactions: for example, when your database is for a bank and you're updating several records that all need to be in sync, you want the transaction to revert to the initial state if there's a power outage, so that credit equals purchase, etc.
But when I get into conversations about MongoDB, those of us that don't know the technical details of how databases are actually implemented start throwing around statements like:
MongoDB is way faster than MySQL and Postgres, but there's a tiny chance, like...
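The bank example above can be made concrete with a toy sketch (the account names and amounts are mine): without transactional atomicity, a failure between the two writes leaves the data inconsistent, and nothing rolls the first write back.

```python
# Toy "database": a dict, updated with two separate writes and no transaction.
accounts = {"alice": 100, "bob": 0}

def transfer_without_txn(src, dst, amount, crash_midway=False):
    accounts[src] -= amount
    if crash_midway:
        raise RuntimeError("power outage")   # simulated failure between writes
    accounts[dst] += amount

try:
    transfer_without_txn("alice", "bob", 40, crash_midway=True)
except RuntimeError:
    pass

print(accounts)   # alice lost 40, bob never received it
```

An ACID-compliant database would treat both writes as one unit, so after the crash the state would be as if the transfer never started.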
When we have to predict the value of a categorical (or discrete) outcome, we use logistic regression. I believe we use linear regression to also predict the value of an outcome given the input values.
Then, what is the difference between the two methodologies?
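One way to see the difference: both models compute the same linear score w*x + b, but linear regression reports it directly as a continuous prediction, while logistic regression squashes it through a sigmoid into a probability for a class label (a sketch; w and b here are arbitrary illustrative values, not fitted coefficients):

```python
import math

w, b = 2.0, -1.0

def linear_prediction(x):
    # Regression: any real number, interpreted as the outcome itself.
    return w * x + b

def logistic_prediction(x):
    # Classification: the same score mapped into (0, 1) as P(y = 1 | x),
    # then typically thresholded at 0.5 to pick a class.
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

print(linear_prediction(3.0))    # 5.0
print(logistic_prediction(3.0) > 0.5)
```

The fitting objectives differ accordingly: least squares for linear regression versus maximum likelihood of the class labels for logistic regression.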
I am writing an automation test script using Robot Framework & Selenium2Library for testing our web application (in .txt format). One of my test cases involves checking the CSS style attribute of an HTML tag.
Is there any specific keyword in Robot Framework to obtain the CSS style attribute of an html element?
Here is my testing scenario:
<div id="check_style" style="width:20px;height:20px;background-color:#ffcc00;"></div>
Now, I have to store the background color of this particular html tag into a variable ${bg_color}. Is there any specific keyword in Robot Framework to do this process?
Can you please suggest an effective way to handle this situation?
I think we can make use of this javascript function for the above mentioned purpose :
document.getElementById("check_style").style
But how do I make use of this particular function to store the value of background-color into a variable ${bg_color}? (I have tried to execute ${bg_color} = Execute Javascript document.getElementById("check_style").style, but...
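Whatever keyword retrieves the attribute (e.g. Get Element Attribute or Execute Javascript in the Robot test), the inline style attribute itself is just "property:value;..." text, so a small helper can pull out background-color (a sketch; the helper name is mine, and note this reads only the inline style, not the computed style):

```python
# The style string from the <div> in the scenario above.
style = "width:20px;height:20px;background-color:#ffcc00;"

def style_to_dict(style_attr):
    """Split an inline style attribute into a property -> value dict."""
    pairs = (p.split(":", 1) for p in style_attr.split(";") if p.strip())
    return {k.strip(): v.strip() for k, v in pairs}

bg_color = style_to_dict(style)["background-color"]
print(bg_color)   # #ffcc00
```

In the Robot test, the returned string would be assigned to ${bg_color} the same way any keyword return value is.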
What is the difference between the Azure IoT Suite and an IoT Hub, and how is each used? Please also tell me the basics of how .NET works in the Internet of Things. Thanks for the help!
I'm trying to understand the relationship between the number of cores and the number of executors when running a Spark job on YARN.
The test environment is as follows:
Number of data nodes: 3
Data node machine spec:
CPU: Core i7-4790 (# of cores: 4, # of threads: 8)
RAM: 32GB (8GB x 4)
HDD: 8TB (2TB x 4)
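A common rule of thumb (a back-of-the-envelope sketch, not an official formula) for a cluster like the one above: leave roughly one core and 1 GB per node for the OS and Hadoop daemons, cap executors at about 5 cores each to keep HDFS client throughput healthy, and reserve one executor's worth of resources for the YARN ApplicationMaster:

```python
# Cluster spec from the question: 3 nodes, 8 hardware threads, 32 GB RAM each.
nodes, vcores_per_node, ram_gb_per_node = 3, 8, 32

usable_cores_per_node = vcores_per_node - 1            # leave 1 for OS/daemons
cores_per_executor = 5                                 # rule-of-thumb cap
executors_per_node = usable_cores_per_node // cores_per_executor
total_executors = nodes * executors_per_node - 1       # one slot for the YARN AM
mem_per_executor_gb = (ram_gb_per_node - 1) // executors_per_node

print(total_executors, cores_per_executor, mem_per_executor_gb)
```

In practice the executor memory request must also leave headroom for YARN's memory overhead (spark.yarn.executor.memoryOverhead), so the usable heap is somewhat smaller than the raw figure computed here.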
We have several DocuSign accounts in our organization. We are planning to build a Tableau dashboard that gets usage/billing data from all accounts. DocuSign uses an OAuth token for authentication, and we're not sure how to authenticate in Tableau. Could you help us with this?
https://developers.docusign.com/esign-rest-api/reference/Billing/Invoices/get
I can't find any command to uninstall and remove all PyTorch dependencies, even on the pytorch.org website.
I installed PyTorch with
conda install pytorch torchvision cuda80 -c soumith
I need to write the equivalent of the following code in R, but I'm not quite sure how to go about it:

def add(args):
    result = args + args
    return result
The reason is that, for the platform I am using (Cloudera Data Science Workbench), models need a JSON input to be able to call them using an API key.
So if I write a test model in R such as:
f <- function(x, y) {
return (x + y)
}
I cannot do a call like {"x" : 2, "y" : 4} using the httr package.
So I either need to make a dictionary-like call for functions in R,
OR
I am simply calling the JSON incorrectly, in which case could someone help me format it correctly for an API call?
Thanks
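On the JSON side, one way to debug the call is to build the body explicitly and inspect it before sending. The sketch below assumes an envelope where the model's arguments sit under a "request" key (that envelope is my assumption about the CDSW contract, not something stated in the question; verify against the platform docs):

```python
import json

# Hypothetical request body: access key plus the arguments the model reads.
payload = {"accessKey": "YOUR_KEY", "request": {"x": 2, "y": 4}}
body = json.dumps(payload)
print(body)

# The model-side handler then receives one object and indexes into it,
# which matches an R function written as f <- function(args) args$x + args$y.
args = json.loads(body)["request"]
print(args["x"] + args["y"])   # 6
```

The key idea is that the deployed function takes a single argument (the parsed JSON object) and picks out named fields, rather than taking x and y as separate positional parameters.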
I am trying to implement a sample word count program using Hadoop. I have downloaded and installed Hadoop 2.0.0. I want to do this sample program using Eclipse, because I think later in my real project I will have to use Eclipse only.
I am not able to find Hadoop-related jar files like hadoop-core.jar and the other required jar files. I searched all the folders of Hadoop 2.0 but couldn't find those files. The same files are available in the 1.0 version of Hadoop but not in the 2.0 version. Where can I get these files?
I am not able to find much information about the 2.0 version.
Please help.
So I happened to receive an .xlsx file that contains names of individuals with different titles such as Mr, Ms, Dr, Mrs, Judge, etc. However, some of these names contain multiple titles within one name, for example "Mr Mrs Ronderval", "Dr Rev Johns Mr", etc. So I am trying to remove all of them except one; the final result should be Mr Ronderval or Mrs Ronderval, and Dr Johns or Rev Johns or Mr Johns (any of them will be fine). So far what I have done is convert the strings into a list of lists such as name_list = , and I have a list of titles title = . I tried to iterate through name_list removing all values found in titles, and the result obviously is "Roderval" and "Johns", but I want at least one title to be left in each name. How do I go about this?
Here is my code using list comprehension
name_list=
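One approach is to keep the first title token encountered and drop the rest (a sketch; the sample names and the title set below are mine, since the original name_list and title lists were lost in the paste):

```python
titles = {"Mr", "Ms", "Mrs", "Dr", "Rev", "Judge"}

def keep_first_title(name):
    """Keep only the first title in a name, dropping any later titles."""
    tokens = name.split()
    first_title = next((t for t in tokens if t in titles), None)
    rest = [t for t in tokens if t not in titles]
    return " ".join(([first_title] if first_title else []) + rest)

print(keep_first_title("Mr Mrs Ronderval"))   # Mr Ronderval
print(keep_first_title("Dr Rev Johns Mr"))    # Dr Johns
```

Unlike stripping every title, this filters the non-title tokens separately and then prepends the one retained title, so names with no title at all also pass through unchanged.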
Lately, I started to learn neural networks, and I would like to know the difference between Convolutional Deep Belief Networks and Convolutional Networks. There is a similar question here, but there is no exact answer to it. We know that Convolutional Deep Belief Networks are CNNs + DBNs. I am going to do object recognition, and I want to know which one is better than the other, or how their complexity compares. I searched but couldn't find anything; maybe I'm doing something wrong.
I have a Spark Streaming application which produces a dataset every minute. I need to save/overwrite the results of the processed data.
When I tried to overwrite the dataset, org.apache.hadoop.mapred.FileAlreadyExistsException stops the execution.
I set the Spark property set("spark.files.overwrite","true"), but there was no luck.
How to overwrite or Predelete the files from spark?
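Two notes before the sketch: as far as I can tell, spark.files.overwrite only controls overwriting of files shipped via SparkContext.addFile, not job output, and for DataFrame output the usual fix is df.write.mode("overwrite"). For the "predelete" route the question mentions, a local-filesystem version looks like this (a sketch; the directory name is mine, and output on HDFS would need the Hadoop FileSystem API instead of shutil):

```python
import pathlib, shutil

def predelete(path):
    """Remove an output directory if it exists, so the next batch can write."""
    p = pathlib.Path(path)
    if p.exists():
        shutil.rmtree(p)

# Simulate a leftover output directory from the previous minute's batch.
out = pathlib.Path("minute_batch_output")
out.mkdir(exist_ok=True)
(out / "part-00000").write_text("previous minute's data")

predelete(out)
print(out.exists())   # False
```

Calling such a helper at the start of each micro-batch avoids the FileAlreadyExistsException, at the cost of losing the old output if the new write fails partway.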
I cannot understand how the example in the PyTorch documentation corresponds to the explanation:
Returns a new tensor with a dimension of size one inserted at the specified position.
>>> x = torch.tensor(…)
>>> torch.unsqueeze(x, 0)
tensor(…)
>>> torch.unsqueeze(x, 1)
tensor(…)
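The shape effect can be seen without torch at all. With my own example values (the doc snippet's tensor contents were lost in the paste above), unsqueeze inserts a size-one dimension at the given position:

```python
x = [1, 2, 3, 4]                 # shape (4,)

unsqueezed_0 = [x]               # like torch.unsqueeze(x, 0): shape (1, 4)
unsqueezed_1 = [[v] for v in x]  # like torch.unsqueeze(x, 1): shape (4, 1)

print(unsqueezed_0)   # one row containing all four values
print(unsqueezed_1)   # four rows of one value each
```

So dim=0 wraps the whole tensor in one extra outer dimension, while dim=1 wraps each element individually; no values change, only the shape.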
I am getting the following error while starting the namenode for the latest hadoop-2.2 release. I didn't find a winutils exe file in the hadoop bin folder. I tried the commands below:
$ bin/hdfs namenode -format
$ sbin/yarn-daemon.sh start resourcemanager
ERROR util.Shell (Shell.java:getWinUtilsPath(303)) - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
at org.apache.hadoop.yarn.server.resourcemanager.ResourceManager.main(ResourceManager.java:863)
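The "null\bin\winutils.exe" in the trace indicates Hadoop resolved its home directory to null. As far as I know, Shell.java reads the hadoop.home.dir system property or the HADOOP_HOME environment variable, so a quick sanity check (a sketch) covers the two things winutils needs on Windows:

```python
import os, pathlib

# 1) Is HADOOP_HOME set at all?  2) Does it actually contain bin\winutils.exe?
hadoop_home = os.environ.get("HADOOP_HOME")
print("HADOOP_HOME set:", hadoop_home is not None)

if hadoop_home:
    winutils = pathlib.Path(hadoop_home) / "bin" / "winutils.exe"
    print("winutils.exe present:", winutils.exists())
```

If either check fails, setting HADOOP_HOME to the Hadoop install directory and placing a winutils.exe built for your Hadoop version in its bin folder is the usual remedy.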