How can I auto-populate several Excel sheets from other Excel files?
I am currently working on a Power BI dashboard that uses an Excel file as a data source. I want to auto-populate that Excel file with new values from existing Excel reports each day.
In the source file there are several sheets, each with several columns (date, inventory, transportation capacity, etc.).
My objective is to get specific column values from each Excel report corresponding to today's date and add them to the corresponding columns in the source file (while adding a new row for today's date).
I have checked several articles on how to use Microsoft Azure and VBA, but I didn't find exactly what I am looking for, especially since the reports need some cleaning (they contain headers, titles, comments, etc.).
Any ideas?
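For context, a minimal Python sketch of the kind of daily append I have in mind, assuming both the reports and the source file are .xlsx; the file names, sheet names, column names, and the number of rows to skip are placeholders:

# Minimal sketch: pull today's values from one report and append them to the source file.
from datetime import date

import pandas as pd
from openpyxl import load_workbook

today = pd.Timestamp(date.today())

# Read one report, skipping the decorative title/comment rows above the real header.
report = pd.read_excel("daily_inventory_report.xlsx", sheet_name="Report", skiprows=3)
report["Date"] = pd.to_datetime(report["Date"])
todays_rows = report.loc[report["Date"] == today, ["Date", "Inventory", "Capacity"]]

# Append today's values as new rows at the bottom of the source sheet.
wb = load_workbook("powerbi_source.xlsx")
ws = wb["Inventory"]
for _, row in todays_rows.iterrows():
    ws.append([row["Date"], row["Inventory"], row["Capacity"]])
wb.save("powerbi_source.xlsx")

The number of rows to skip and the column names would have to be adapted per report, which is where the cleaning logic would go.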
See also questions close to this topic
-
.Body & .HTMLBody Indent and Default Signature Issue
I've set up a macro to send an e-mail through Outlook.
.Body is read from a cell inside the file with indents. Since the value will change depending on the usage, I need to reference that cell for the body.
However, two issues arise: using .HTMLBody I lose the indents, which are constructed with CHAR(10), but I keep the default HTML signature.
When using just .Body the indents are displayed correctly, however the default signature is not rendered as HTML and I lose the images.
How should I go about fixing this issue?
My code:
sig = .HTMLBody
body = xlSht.Range("B4").Value
.To = xlSht.Range("B2").Value
.CC = ""
.Subject = xlSht.Range("B1").Value
.Body = body & sig
.Display
I'd really appreciate your assistance.
Thanks.
-
Yahoo Finance no longer returns VBA cookie request for .getResponseHeader("Set-Cookie")
The following Excel VBA code segment has worked for years, but stopped working around 28 Apr 2022. I receive the responseText, but the .getResponseHeader("Set-Cookie") returns null.
Set httpReq = New WinHttp.WinHttpRequest
DownloadURL = "https://finance.yahoo.com/lookup?s=" & stockSymbol
With httpReq
    .Open "GET", DownloadURL, False
    .setRequestHeader "Content-Type", "application/x-www-form-urlencoded; charset=UTF-8"
    .Send
    .waitForResponse
    response = .responseText
    cookie = Split(.getResponseHeader("Set-Cookie"), ";")(0)
End With
-
export excel rows to individual json files in python
My Excel file has 500 rows of data. I am trying to get 500 individual JSON files, each containing the data from only one row. Thank you in advance.
import json
import pandas

excel_data_df = pandas.read_excel("F:/2/N.csv.xlsx", sheet_name='Sheet1')

# One dict per row, so each output file holds exactly one record.
records = excel_data_df.to_dict(orient='records')
for idx, row in enumerate(records):
    fpath = str(idx) + ".json"
    with open(fpath, "w+") as f:
        json.dump(row, f)
-
Pentaho Spoon: search and replace special characters in rows
I have a CSV file with MIME type US-ASCII, and one column in the dataset looks like this:
id      V_name
210001  cha?ne des Puys
210030  M?los
213004  G?ll?
213021  S?phan
221110  Afd?ra

And so on.
I would like to change those characters to:
id      V_name
210001  chaine des Puys
210030  Milos
213004  Gollu
213021  Suphan
221110  Afdera

The thing is that there are 95 rows of this kind; how can I search and replace those rows? I am using the PDI Spoon suite. Thanks in advance.
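Outside of Spoon, a small Python pre-processing pass over the CSV would be one alternative. The sketch below is only an illustration: the mapping is built from the five rows shown above (the real list of 95 replacements would have to come from the source data), and the file names and delimiter are assumptions.

# Replace known garbled names with corrected ones while copying the CSV.
import csv

replacements = {
    "cha?ne des Puys": "chaine des Puys",
    "M?los": "Milos",
    "G?ll?": "Gollu",
    "S?phan": "Suphan",
    "Afd?ra": "Afdera",
}

with open("volcanoes.csv", newline="") as src, \
     open("volcanoes_clean.csv", "w", newline="") as dst:
    reader = csv.DictReader(src)                       # assumes a comma-delimited file
    writer = csv.DictWriter(dst, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        row["V_name"] = replacements.get(row["V_name"], row["V_name"])
        writer.writerow(row)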
-
Join large set of CSV files where the header is the timestamp for the file
I have a large set of CSV files, approximately 15,000 of them, and I would like to figure out how to join them together into one file for data processing.
Each file follows a simple pattern, with a timestamp that corresponds to the period of time represented by the data in that CSV file.
Ex.
file1.csv
2021-07-23 08:00:00
Unit.Device.No03.ErrorCode;11122233
Unit.Device.No04.ErrorCode;0
Unit.Device.No05.ErrorCode;0
Unit.Device.No11.ErrorCode;0
file2.csv
2021-07-23 08:15:00
Unit.Device.No03.ErrorCode;0
Unit.Device.No04.ErrorCode;44556666
Unit.Device.No05.ErrorCode;0
Unit.Device.No11.ErrorCode;0
Each file starts with the timestamp. I would like to join all the files in a directory, transpose the "Unit.Device..." keys into columns, use the original header as a Timestamp column, and add one new row per file with the corresponding "ErrorCode" value in each column.
Like this:
Timestamp;Unit.Device.No03.ErrorCode;Unit.Device.No04.ErrorCode;Unit.Device.No05.ErrorCode..
2021-07-23 08:00:00;11122233;0;0;0;0....
2021-07-23 08:15:00;0;44556666;0;0;0....
Any simple tools for this, or Python routines?
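A minimal pandas sketch of that reshape, under the assumption that every file has the timestamp on its first line and semicolon-separated key;value pairs below it (the directory and output names are placeholders):

# Build one row per file, with one column per Unit.Device key.
from pathlib import Path

import pandas as pd

rows = []
for path in sorted(Path("csv_dir").glob("*.csv")):
    lines = path.read_text().splitlines()
    row = {"Timestamp": lines[0].strip()}              # first line is the timestamp
    for line in lines[1:]:
        if ";" in line:
            key, value = line.split(";", 1)
            row[key.strip()] = value.strip()
    rows.append(row)

df = pd.DataFrame(rows)
df.to_csv("joined.csv", sep=";", index=False)

pandas handles differing column sets across files by filling missing keys with NaN, so files that omit a device still line up in the output.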
-
Can you use a variable to represent a slicer selection in PowerBi?
I am having a problem displaying a sum value in a gauge visual. My table has many rows, each with a value and a date. I wrote a measure that calculates the sum of the values filtered by the type of product and whether it was sold during the day or night. The problem is that there are many rows of values with the same date, so when I put a date drop-down slicer on the page, the measure doesn't return a value unless I make the date slicer a "range", which I don't want. I am assuming that the measure doesn't know which value to return when the slicer is in drop-down mode. How can I make this work? Can I use two variables in the DAX code for the measure that provide a date "range" based on the selection the user chooses in the date drop-down? See below for my current code.
Table:
DATE      | Product Sold | Day or Night
----------|--------------|-------------
4/1/2022  | 1000         | N
4/1/2022  | 500          | D
4/1/2022  | 800          | N
4/1/2022  | 2000         | D
4/2/2022  | 900          | N

Measure = CALCULATE(
    SUM('Table'[Product Sold]),
    'Table'[Day or Night] IN { "N" }
)
So the gauge visual won't display a value based on this measure when using a drop down date slicer. It should display 1,800 when the drop down date slicer has a date of 4/1/2022
The gauge visual WILL display the sum value when using a date range slicer but I can't use that type.
I hope this makes sense. There has to be a way around it to make it work. I'm at a loss.
-
DAX error when creating calculated column using Lookupvalue function
I am creating a calculated column in Power BI using multiple IF and LOOKUPVALUE functions, using LOOKUPVALUE to return "Yes" if the SKU exists in the Spare SKU table.
I am getting an error on the 3rd line: Function 'LOOKUPVALUE' does not support comparing values of type Text with values of type True/False. Consider using the VALUE or FORMAT function to convert one of the values.
Below is the code; please advise.
Spare SKU = IF('Inventory Raw Data'[site_id] = "1111" || 'Inventory Raw Data'[site_id] = "2222",
    IF('Inventory Raw Data'[Material Type] = "ZZZZ", "Yes",
        IF(LOOKUPVALUE('Spare SKU '[ERP_SKU], 'Spare SKU '[ERP_SKU], 'Inventory Raw Data'[ERP_SKU] = BLANK()), "No", "Yes")),
    "No")
-
TextField populated by the browser autofill with incorrect font size
I have a simple page with a user text field and a password field, with a font size of 30 (for this question, only on the user field).
For some reason, when the browser (Chrome) populates the field it is rendered with the wrong font size (until the text field gains focus):
As you can see in the picture: Left - auto fill, wrong size; Right - correct font size, after focusing the field.
The code (also in sandbox):
import * as React from "react";
import TextField from "@mui/material/TextField";

export default function ThemeVariables() {
  const [name, setName] = React.useState("");
  const [psw, setPsw] = React.useState("");
  return (
    <div>
      <h2>Name</h2>
      <TextField
        inputProps={{ style: { fontSize: 30 } }}
        InputLabelProps={{ style: { fontSize: 30 } }}
        onChange={(event) => setName(event.target.value)}
        value={name}
      />
      <h2>Psw</h2>
      <TextField
        type="password"
        value={psw}
        onChange={(event) => setPsw(event.target.value)}
      />
      <button onClick={() => console.log(name + " " + psw)}>Submit</button>
    </div>
  );
}
(You can save the password via the key icon in your browser, to the right of the URL address.)
Any idea how to populate it with the right font size?
P.S. The small font size behavior even reproduces while the TextField is in focus and the user hovers over the saved passwords:
-
cordova webview processing autofill form values cannot be parsed or is out of range
I have read many threads on this issue, but everything deals with standard browsers and WebKit; virtually nothing relates to app webviews that run on Cordova/Ionic (or other) Android and iOS.
I have finally gotten form autofill to work on my app webforms for both Android and iOS. The problem is that I need to sanitize the info before it is added to the input field or assigned to the ng-model (I am not certain which one is causing this issue).
My form field:
<input type="number" placeholder="Mobile Number" inputmode="numeric" pattern="[0-9]*" id="user_phone" autocomplete="tel-national" ng-model="regObj.user_phone" ng-keypress="monitorLength($event,'user_phone',phoneNumLengths,1)" ng-blur="verifyLength('user_phone',phoneNumLengths)" />
I have various functions that are detecting length to ensure proper phone numbers are entered, helping reduce errors.
In the above example, which is designed for a number that is 10 digits long (no spaces or dashes are allowed), the autofill window pops up with a top option of +12025551212. If the user selects this option, as it is their phone number, it causes the following warning and the input field is not populated:
The specified value "+12025551212" cannot be parsed, or is out of range
In this case, the + is not numeric, and the fact that the string is 12 characters long is double trouble since the input field is limited to 10. Thus I don't know if ng-model is causing the error or if input type="number" is causing the issue, or both.
Either way, how can I capture the selected autofill value so I can parse it to strip out the +1, or at a minimum detect that the value is out of range and alert the user?
-
fill textbox without detection
I made a Mozilla extension to fill a form, but the site owner detected that I am filling the form programmatically. After all the textbox values are set, my extension clicks the data confirmation button, but then all the textbox values are deleted and it alerts me to fill them in again. When I try to fill them in manually with my fingers, it works. I am using:
element.value = "sth";
Please show me another way to fill the textboxes that avoids detection. (I tried dispatchEvent but it won't update the textbox.)
-
Architecture for tracking data changes in application DB required for warehousing
Overview
I have an OLTP DB that stores application transaction data and acts as the source of truth for the current state of the application. I would like to have my DWH store historical data so I can do analyses that compare previous states of the application to the current state. In Kimball terminology, I would like Type 2 dimensions for my SCDs.
In the application DB, changes to the dimensions are not tracked but rather updated in place for efficiency reasons. So, if I were to ETL directly from my application DB once per day, I would be losing historical data potentially. My question is how can I track these data changes for the sake of warehousing? Here are some options I have considered:
Option #1: Ingest Event Data Into Data Lake
Whenever anything happens in the application, an event is emitted. These events can capture all the information I need for the warehouse. My thought is that you could emit the events using something like Apache Kafka, and have some process listen for the events and store them in raw form in a data lake that's completely immutable. Then, you would use an ETL process that works from the data lake instead of the application DB to load up the warehouse (a rough sketch of the event emission follows the lists below).
Pros
- Can achieve real-time data analysis if required in the future (currently not necessary)
- An immutable data lake can act as a foundation for other types of analytics such as ML or other warehouses
- The data lake serves as a single source of truth for all data which will be nice for the future when there are multiple application DBs and other sources of data ingestion
Cons
- Requires an event processing/streaming service which is more overhead to maintain
- Data can be lost/duplicated causing the lake to not reflect the application DB
- Requires storing data in two places which is more developmental overhead
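As referenced above, a rough sketch of what the event emission from the application could look like, assuming the kafka-python client; the topic name and event fields are placeholders:

# Emit one immutable event per dimension change; a consumer appends these to the data lake.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def emit_dimension_change(entity: str, entity_id: int, changes: dict) -> None:
    producer.send("app-dimension-changes", {
        "entity": entity,
        "id": entity_id,
        "changes": changes,
    })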
Option #2: Batch Process Application DB Snapshots
In this scenario, I would use the daily snapshots of the DB as a source for ETL'ing into the DWH. Historical data would be at the grain of how often the snapshot takes place (daily in this case). This would mean that any change data that happens within the day would be lost. However, it may not be that important to store such fine-grained data anyways.
Pros
- Data storage is not duplicated between two places
- No extra infrastructure is required as daily snapshots are already automatically obtained and stored in S3
- Data integrity is maintained because we're working directly with the current application state so we can stay in-sync better
Cons
- Requires a delta calculation against the previous snapshot to determine which new dimension objects need to be imported (this may actually be required in any scenario but seems more natural to do with the event architecture; a rough sketch of this delta step follows this list)
- The grain of the historical data is coupled to the frequency at which snapshots occur
- Only compatible with ETL'ing into DWH and would not be as useful for ML/data science applications that work well with raw data
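A rough sketch of the delta step mentioned above, assuming the snapshots can be loaded as tables keyed by a business key; the file paths, key, and tracked columns are placeholders:

# Compare yesterday's and today's snapshot of a dimension table and find the rows
# that would become new Type 2 versions in the DWH.
import pandas as pd

key = "customer_id"                       # hypothetical business key
tracked = ["name", "address", "status"]   # hypothetical SCD columns

prev = pd.read_csv("snapshot_2022-05-01/customers.csv")
curr = pd.read_csv("snapshot_2022-05-02/customers.csv")

merged = curr.merge(prev, on=key, how="left", suffixes=("", "_prev"), indicator=True)

# Keys never seen before become brand new dimension rows.
new_rows = merged[merged["_merge"] == "left_only"]

# Existing keys whose tracked attributes differ become new Type 2 versions.
prev_cols = [c + "_prev" for c in tracked]
changed = merged[
    (merged["_merge"] == "both")
    & merged[tracked].ne(merged[prev_cols].values).any(axis=1)
]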
Option #3: Batch Process Log Data
Instead of emitting an event to something like Apache Kafka, store the event data in a temporary log file on the application server. Then, the ETL process would involve scanning all the application servers to grab and process the log files. These log files would store extensive transaction history, enough to get all the data required for the DWH.
Pros
- Logging events to a file is easy from a developmental standpoint on the application server
- A full transaction history can be stored so we don't lose any information
- Minimal performance impact on application server
Cons
- Data reliability is lower than the other two options because application servers can be torn down at any moment to accommodate scaling, which would lead to log files getting lost (also, the server can crash)
- Processing log data requires extra parsing logic
- ETL would need to work directly against server instances which may require service discovery
-
What is the best strategy to store redis data to MySQL for permanent storage?
I am running a couple of crawlers that produce millions of datasets per day. The bottleneck is the latency between the spiders and the remote database. If the spider server is located too far away, the latency slows the crawler down to the point where it can no longer complete the datasets needed for a day.
In search of a solution I came upon Redis, with the idea of installing Redis on the spider server, where it will temporarily store the collected data with low latency, and then somehow moving that data from Redis to MySQL.
The setup is like this until now:
- About 40 spiders running on multiple instances feed one central MySQL8 remote server on a dedicated machine over TCP/IP.
- Each spider writes different datasets, one kind of spider gets positions and prices of search results, where there are 100 results with around 200-300 inserts on one page. Delay is about 2-10s between the next request/page.
The latter one is the problem, as the spider yields every position within that page and creates a remote insert within a transaction, maybe even a connection (not sure at the moment).
This currently only works because the spiders and the remote MySQL server are close (same data center) with ping times of 0.0x ms; it does not work with ping times of 50 ms, as the spiders cannot write fast enough.
Is redis or maybe DataMQ a valid approach to solve the problem or are there other recommended ways of doing this?
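A minimal sketch of the buffering idea, assuming a Redis list on the spider server as the queue and a separate worker doing batched inserts into MySQL; the queue name, table, columns, and credentials are placeholders:

import json

import pymysql
import redis

r = redis.Redis(host="localhost", port=6379)

# Spider side: push each scraped row into the local Redis list (sub-millisecond latency).
def store_row(row: dict) -> None:
    r.rpush("scraped_rows", json.dumps(row))

# Worker side (run close to the MySQL server): drain the queue in batches.
def flush_batch(batch_size: int = 500) -> None:
    rows = []
    for _ in range(batch_size):
        raw = r.lpop("scraped_rows")
        if raw is None:
            break
        rows.append(json.loads(raw))
    if not rows:
        return
    conn = pymysql.connect(host="db-host", user="user", password="pw", database="crawl")
    try:
        with conn.cursor() as cur:
            cur.executemany(
                "INSERT INTO results (position, price, url) VALUES (%s, %s, %s)",
                [(row["position"], row["price"], row["url"]) for row in rows],
            )
        conn.commit()
    finally:
        conn.close()

Batching the inserts means the 50 ms round trip is paid once per few hundred rows instead of once per row, which is where most of the latency cost would go away.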
-
Fivetran Shopify Connector: Would like to extract and load the raw data from Shopify - which schema destination is the right one?
Which schema should I choose here to get all the raw data from my Shopify store? The problem is that there is no exact description of which schemas are available.