Express.js & PostgreSQL: PUT request inserts null for undefined fields
I have an edit-profile endpoint where it is possible to edit one or more fields. When I make a PUT request and want to change only one field, every undefined field becomes null. How do I pass the undefined fields so that every field not mentioned in the request keeps its current value?
// update profile
app.put("/id/:id", async (req, res) => {
    try {
        const { id } = req.params;
        const { username, email, userdescription, photo_url } = req.body;
        const ifEmailExists = await isEmailAvailable(email);
        const ifUsernameExists = await isUsernameAvailable(username);

        if (ifEmailExists && ifUsernameExists) {
            await pool.query(
                "UPDATE users SET (username, email, userdescription, photo_url) = ($1, $2, $3, $4) WHERE id = $5",
                [username, email, userdescription, photo_url, id]
            );
            res.json("User data was updated successfully!");
        } else if (!ifUsernameExists && !ifEmailExists) {
            res.json({ msg: "Username and email are already used!" });
        } else if (!ifEmailExists) {
            res.json({ msg: "Email is already used!" });
        } else {
            res.json({ msg: "Username is already used!" });
        }
    } catch (err) {
        console.error(err.message);
    }
});
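One common fix, sketched below as my own suggestion rather than part of the original question: node-postgres passes undefined parameters as NULL, so COALESCE can fall back to the column's current value for every field the request leaves out. The trade-off is that this endpoint can then never set a field to NULL explicitly.

    const updUser = await pool.query(
        `UPDATE users
            SET username        = COALESCE($1, username),
                email           = COALESCE($2, email),
                userdescription = COALESCE($3, userdescription),
                photo_url       = COALESCE($4, photo_url)
          WHERE id = $5`,
        [username, email, userdescription, photo_url, id]
    );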
See also questions close to this topic
- Error message: MongoServerError: bad auth: Authentication failed
What is the reason behind this error? This is the code I am using to connect to the DB:
const uri =`mongodb+srv://${process.env.DB_USER}:${process.env.DB_PASSWORD}@cluster0.xft2s.mongodb.net/myFirstDatabase?retryWrites=true&w=majority`;
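A hedged aside, assuming the password may contain special characters: "bad auth" usually means the credentials are wrong, and a common cause is an unescaped password in the connection string. The username and password in a mongodb+srv URI must be percent-encoded, and the credentials must belong to an Atlas database user, not the Atlas account login. A minimal sketch:

    const user = encodeURIComponent(process.env.DB_USER);
    const pass = encodeURIComponent(process.env.DB_PASSWORD); // escapes characters like @ : / ? #
    const uri = `mongodb+srv://${user}:${pass}@cluster0.xft2s.mongodb.net/myFirstDatabase?retryWrites=true&w=majority`;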
- In Mongo, if I'm saving a document named "Prateek", I don't want the next create operation to save "prateek", "praTEEK", etc.
If I'm adding a new document with the name "India", then I don't want the DB to allow another document with the name "INDIA", "india", "indIA", etc. I'm new and learning; help would be great!
// Controller
var Dinosaur = require('../models/dinosaurs');

// addDino
module.exports.addDino = (req, res) => {
    var name = req.body.name;
    var type = req.body.type;
    var height = req.body.height;
    var weight = req.body.weight;
    var Period = req.body.Period;

    req.checkBody('name', 'Name is required').notEmpty();
    var errors = req.validationErrors();
    if (errors) return res.status(400).send({ message: 'Name is Required' });

    let newDino = {
        name: name,
        type: type,
        height: height,
        weight: weight,
        Period: Period
    };

    Dinosaur.addDino(newDino, (err, result) => {
        if (err) {
            if (err.name) return res.status(409).send({ message: name + ' Already Exist' });
            else if (err.url) return res.json({ status: false, error: { url: "Url already exist" }, message: err.url });
            else return res.json(err, "Server Error");
        } else {
            return res.status(200).send({ message: "Done" });
        }
    });
};
// Model
var mongoose = require('mongoose');

// dinosaur schema
var DinosaurSchema = mongoose.Schema({
    name: { type: String, unique: true },
    type: { type: String },
    height: { type: Number },
    weight: { type: Number },
    Period: { type: String }
});

var Dinosaur = mongoose.model('dinosaur', DinosaurSchema);

// add
module.exports.addDino = (query, callback) => {
    Dinosaur.create(query, callback);
};
// GetAll: a document with the name "Brachiosaurus" has already been created.
// Create: a new create with the first letter lower-case, "brachiosaurus", should not be pushed.
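A minimal sketch of one way to enforce this, assuming Mongoose and MongoDB 3.4+ (my suggestion, not the asker's code): a unique index with a case-insensitive collation (strength 2) makes "India", "INDIA", and "india" compare as equal, so inserting a case-variant duplicate fails at the index level.

    var mongoose = require('mongoose');

    var DinosaurSchema = new mongoose.Schema({
        name: { type: String, required: true },
        type: String,
        height: Number,
        weight: Number,
        Period: String
    });

    // Case-insensitive unique index: rejects "brachiosaurus" once "Brachiosaurus" exists.
    DinosaurSchema.index(
        { name: 1 },
        { unique: true, collation: { locale: 'en', strength: 2 } }
    );

    module.exports = mongoose.model('dinosaur', DinosaurSchema);

Note that reads which should match case-insensitively must opt in to the same collation, e.g. Dinosaur.find({ name: 'india' }).collation({ locale: 'en', strength: 2 }).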
- I am trying to run the Ganache node but am getting this error
- ERROR: invalid byte sequence for encoding, with psql
I've seen numerous issues in other posts with the copy command and:
ERROR: invalid byte sequence for encoding "UTF8": 0xfc
And the consensus in these posts appears to be to specify the encoding in the command you're doing the copy with. I have done so:
psql -h localhost -p 5432 -d BOBDB -U BOB -c "\COPY \"BOBTB01\" FROM 'C:\Temp\file.csv' with csv HEADER ENCODING 'WIN1252'";
Password for user BOB:
ERROR: character with byte sequence 0x81 in encoding "WIN1252" has no equivalent in encoding "UTF8"
CONTEXT: COPY BOBTB01, line 76589
That confused me, so I changed WIN1252 to UTF8. Having done so, I get a slightly different error: the failure is on a different line and the text is slightly different.
psql -h localhost -p 5432 -d BOBDB -U BOB -c "\COPY \"BOBTB01\" FROM 'C:\Temp\file.csv' with csv HEADER ENCODING 'UTF8'";
Password for user BOB:
ERROR: invalid byte sequence for encoding "UTF8": 0xfc
CONTEXT: COPY BOBTB01, line 163
This is the encoding shown in the database:
show client_encoding;

 client_encoding
-----------------
 UTF8
(1 row)
The file is from a reliable source, and I happen to have R installed, which also does .csv imports. The file was pulled into R without issue, which makes me think it's not the file but something else. Is there another switch or syntax that can bypass these issues?
I'm not sure what is wrong.
Can you help?
Thanks.
- What's missing in my Ruby 'inverse of' relationship?
I know this topic has been addressed, but I have been at this for two days and I'm just stuck. I know inverse_of does not create a new query, so should I use another method?
Question: how do I set up an 'inverse of' with a has_one / belongs_to situation and the same class?
Explanation: a user 'has_one :spouse' and 'belongs_to :spouse_from'. They are inverses of each other. When a User signs up, they can invite their significant other. For example:
- user_a invites & creates user_b
- user_b.spouse_id is set to user_a.id
- In a separate method I want to be able to update, e.g., user_a.spouse_id = user_a.spouse.id
The only association that works at this point is user_b.spouse.
class User
  has_one :spouse, class_name: 'User', foreign_key: :spouse_id, dependent: :nullify, inverse_of: :spouse_from
  belongs_to :spouse_from, class_name: 'User', foreign_key: :spouse_id, inverse_of: :spouse, optional: true
end
- Normalizing data in PostgreSQL
This application will read an iTunes library in comma-separated values (CSV) and produce properly normalized tables as specified below. Once you have placed the proper data in the tables, press the button below to check your answer.
We will do some things differently in this assignment. We will not use a separate "raw" table; we will just use ALTER TABLE statements to remove columns once we no longer need them (i.e., once they have been converted into foreign keys).
We will use the same CSV track data as in prior exercises - this time we will build a many-to-many relationship using a junction/through/join table between tracks and artists.
To grade this assignment, the program will run a query like this on your database and look for the data it expects to see:
SELECT track.title, album.title, artist.name
FROM track
JOIN album ON track.album_id = album.id
JOIN tracktoartist ON track.id = tracktoartist.track_id
JOIN artist ON tracktoartist.artist_id = artist.id
ORDER BY track.title LIMIT 3;
The expected result of this query on your database is:

 title                  | album                     | artist
------------------------+---------------------------+--------
 A Boy Named Sue (live) | The Legend Of Johnny Cash | Jo
DROP TABLE album CASCADE;
CREATE TABLE album (
    id SERIAL,
    title VARCHAR(128) UNIQUE,
    PRIMARY KEY(id)
);

DROP TABLE track CASCADE;
CREATE TABLE track (
    id SERIAL,
    title TEXT,
    artist TEXT,
    album TEXT,
    album_id INTEGER REFERENCES album(id) ON DELETE CASCADE,
    count INTEGER,
    rating INTEGER,
    len INTEGER,
    PRIMARY KEY(id)
);

DROP TABLE artist CASCADE;
CREATE TABLE artist (
    id SERIAL,
    name VARCHAR(128) UNIQUE,
    PRIMARY KEY(id)
);

DROP TABLE tracktoartist CASCADE;
CREATE TABLE tracktoartist (
    id SERIAL,
    track VARCHAR(128),
    track_id INTEGER REFERENCES track(id) ON DELETE CASCADE,
    artist VARCHAR(128),
    artist_id INTEGER REFERENCES artist(id) ON DELETE CASCADE,
    PRIMARY KEY(id)
);

\copy track(title,artist,album,count,rating,len) FROM 'library.csv' WITH DELIMITER ',' CSV;

INSERT INTO album (title) SELECT DISTINCT album FROM track;
UPDATE track SET album_id = (SELECT album.id FROM album WHERE album.title = track.album);

INSERT INTO tracktoartist (track, artist) SELECT DISTINCT ...
INSERT INTO artist (name) ...
UPDATE tracktoartist SET track_id = ...
UPDATE tracktoartist SET artist_id = ...

-- We are now done with these text fields
ALTER TABLE track DROP COLUMN album;
ALTER TABLE track ...
ALTER TABLE tracktoartist DROP COLUMN track;
ALTER TABLE tracktoartist ...

SELECT track.title, album.title, artist.name
FROM track
JOIN album ON track.album_id = album.id
JOIN tracktoartist ON track.id = tracktoartist.track_id
JOIN artist ON tracktoartist.artist_id = artist.id
LIMIT 3;
What am I doing wrong with the code?
- DELETE operation with JavaScript fetch() and the Firebase REST API
I use the following API request in order to delete all texts (so-called cps) of one section (one section contains many cps):
await fetch(`https://12345-default-rtdb.europe-west1.firebasedatabase.app/cps/${userId}.json?section_id=${mysection}`, {
    method: 'DELETE',
});
userId is correct, mysection is the current section id, and section_id is the key of the section id in the JSON document (e.g. -N09gWdyQlV7OsPpEx7t or -N09g_HjbcFCQFBiIX0A, see below). In this example all cps of all sections are being deleted, so the conditional query does not work.
What is going wrong here? Thanks!
The tree within the database looks like this:
cps -> user1 -> -N09gWdyQlV7OsPpEx7t
cps -> user1 -> -N09g_HjbcFCQFBiIX0A
....
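As far as I know, the Realtime Database REST API ignores filter parameters on a DELETE; orderBy/equalTo only shape GET responses. A hedged sketch of a workaround, reusing the asker's userId and mysection variables: first query the keys whose section_id matches, then delete each matching child individually. This assumes an ".indexOn": "section_id" rule on the cps/$userId node.

    const base = `https://12345-default-rtdb.europe-west1.firebasedatabase.app/cps/${userId}`;

    // 1) Fetch only the cps whose section_id equals the current section.
    const res = await fetch(`${base}.json?orderBy="section_id"&equalTo="${mysection}"`);
    const matches = await res.json(); // e.g. { "-N09gW...": {...}, ... } or null

    // 2) Delete each matching child by its key.
    if (matches) {
        await Promise.all(Object.keys(matches).map((key) =>
            fetch(`${base}/${key}.json`, { method: 'DELETE' })
        ));
    }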
- OAuth 2.0 flow using Keycloak with a back-end and front-end
I'm working on a project that consists of:
- A back-end in Java (JEE project deployed on Wildfly)
- Front-end developed in Angular
- Keycloak for authorization and authentication
What I need to do is:
- Access my Angular app, which communicates with my backend by calling its APIs (e.g. GET, POST, PUT, DELETE)
- Go to the login page; authentication is done by Keycloak, so I get directed to the Keycloak login page.
- Login is successful, so I get redirected to my Angular landing page and can now navigate my Angular app.
The frontend talks to the backend via RESTful APIs, and I need to use the OAuth 2.0/OpenID standard flow, which means I first get the auth code and then the access token/refresh token to stay connected.
My backend is already configured with Keycloak through the Wildfly adapter provided on the Keycloak official site, and through web.xml and keycloak.json, both in the WEB-INF folder.
So, given that I'm able to get the auth code via the Valid Redirect URIs set in Keycloak, how can I configure my whole project to achieve the three points written above? Do I need two clients in my realm in Keycloak?
Can someone please explain how I can set up the whole flow using Keycloak?
Thanks a lot!
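A hedged sketch of the front-end half, assuming the official keycloak-js adapter and the common two-client setup (a public client for Angular, a separate confidential/bearer-only client for the Wildfly back-end); the URL, realm, and client names below are placeholders:

    import Keycloak from 'keycloak-js';

    const keycloak = new Keycloak({
        url: 'https://keycloak.example.com/auth', // placeholder host
        realm: 'my-realm',
        clientId: 'angular-frontend' // public client; app URL registered as a valid redirect URI
    });

    // Standard (authorization code) flow: redirects to the Keycloak login page,
    // comes back with an auth code, and the adapter exchanges it for tokens.
    await keycloak.init({ onLoad: 'login-required', pkceMethod: 'S256' });

    // Call the back-end with the access token; the Wildfly adapter validates it.
    fetch('/api/resource', {
        headers: { Authorization: `Bearer ${keycloak.token}` }
    });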
- REST client API data types with Laravel
What form of data should I choose as input for a REST API with Laravel: form-data, raw, urlencoded, or something else?
- How to update the product quantity in MongoDB with Node?
app.put('/cycle/:cycleId', async (req, res) => {
    const id = req.params.cycleId;
    const filter = {};
    const options = { upsert: true };
    const updateDoc = {
        $set: {},
    };
    const result = await cycleCollection.updateOne(filter, updateDoc, options);
    res.send(result);
});
I have tried so many times but can't find any solution!
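For what it's worth, the empty filter {} matches the first document in the collection rather than the cycle from the URL, and $set: {} writes nothing. A minimal sketch of what the handler might look like, assuming the native MongoDB driver and a numeric quantity field sent in the request body (the names are illustrative):

    const { ObjectId } = require('mongodb');

    app.put('/cycle/:cycleId', async (req, res) => {
        const filter = { _id: new ObjectId(req.params.cycleId) }; // target the requested cycle
        const updateDoc = { $set: { quantity: req.body.quantity } }; // write the new quantity
        const result = await cycleCollection.updateOne(filter, updateDoc);
        res.send(result);
    });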
- How can I update the qty (quantity) in a MongoDB database?
My collection of data:
{ "_id" : 1, "item_id" : "I001", "comp_id" : "C001", "qty" : 25, "prate" : 30, "srate" : 35, "mrp" : 40 }, { "_id" : 2, "item_id" : "I001", "comp_id" : "C002", "qty" : 30, "prate" : 32, "srate" : 37, "mrp" : 40 }
How should I increment the "qty" using MongoDB? Any help?
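A minimal sketch, assuming the Node.js driver and that the document is identified by item_id and comp_id: $inc atomically adds to the stored value, so there is no read-modify-write race.

    // Increase qty by 5 for one item/company pair (use a negative value to decrease).
    await itemCollection.updateOne(
        { item_id: 'I001', comp_id: 'C001' },
        { $inc: { qty: 5 } }
    );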
- I am getting a 404 error using Node.js and Express
I'm having an issue with the routes in my project called review. My other routes have no issues, so I'm not sure where I went wrong here. I keep getting a 404 error in my frontend and in Postman. I believe everything is linking to the right information. I request the route
http://localhost:8080/api/review/addReview and get the 404 error
This is my server.js
const express = require("express");
const cors = require("cors");
const dbConfig = require("./app/config/db.config");
const app = express();

var corsOptions = {
    origin: "http://localhost:8081"
};

app.use(cors(corsOptions));

// parse requests of content-type - application/json
app.use(express.json());

// parse requests of content-type - application/x-www-form-urlencoded
app.use(express.urlencoded({ extended: true }));

const db = require("./app/models");
// const Role = db.role;

app.use('/uploads', express.static('uploads'));

db.mongoose
    .connect(`mongodb+srv://password@cluster0.gmvao.mongodb.net/shop?retryWrites=true&w=majority`, {
        useNewUrlParser: true,
        useUnifiedTopology: true,
        useFindAndModify: false
    })
    .then(() => {
        console.log("Successfully connect to MongoDB.");
        // initial();
    })
    .catch(err => {
        console.error("Connection error", err);
        process.exit();
    });

// routes
// require(".app/routes/favourite.routes")(app);
require("./app/routes/auth.routes")(app);
require("./app/routes/user.routes")(app);
app.use('/api/admin', require('./app/routes/admin.routes'));
app.use('/api/review', require('./app/routes/review.routes'));

// set port, listen for requests
const PORT = 8080;
app.listen(PORT, () => {
    console.log(`Server is running on port ${PORT}.`);
});
My routes file
const express = require('express');
const router = express.Router();
const { authJwt } = require("../middlewares");
const Review = require("../models/review.model");

router.use(function(req, res, next) {
    res.header(
        "Access-Control-Allow-Headers",
        "x-access-token, Origin, Content-Type, Accept"
    );
    next();
});

router.post("/addReview", [authJwt.verifyToken], (req, res) => {
    const review = new Review(req.body);
    review.save((err, review) => {
        if (err) return res.json({ success: false, err });
        Review.find({ '_id': review._id })
            .populate('author')
            .exec((err, result) => {
                if (err) return res.json({ success: false, err });
                return res.status(200).json({ success: true, result });
            });
    });
});

module.exports = router;
My review model file
const mongoose = require('mongoose');

const Review = mongoose.model(
    "Review",
    new mongoose.Schema({
        prodID: String,
        productTitle: String,
        reviewId: String,
        content: String,
        author: [
            {
                type: mongoose.Schema.Types.ObjectId,
                ref: "User",
            },
        ]
    })
);

module.exports = Review;
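One hedged observation, not a confirmed fix: the router only defines POST /addReview, so requesting http://localhost:8080/api/review/addReview with any other verb (for example, opening it in a browser, which issues a GET) returns a 404. A sketch of a matching client call, with the token header name assumed from the authJwt middleware:

    fetch('http://localhost:8080/api/review/addReview', {
        method: 'POST', // must be POST; a GET to this path will 404
        headers: {
            'Content-Type': 'application/json',
            'x-access-token': token, // assumption: verifyToken reads this header
        },
        body: JSON.stringify({ prodID, productTitle, content }),
    });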
- 422 error when using httr to upload data to the batch prediction API (DataRobot)
Disclaimer: this error is related to a subscription-based tool, and hence the issue is likely not going to be reproducible for any reader. Nevertheless, any suggestions on the HTTP request might still help me debug the issue.

Objective: I am trying to upload a CSV dataset to DataRobot's batch prediction API, having deployed a model for it.

Context and issue:
- DataRobot requires two requests to be made in order to upload a CSV dataset for scoring by the deployed model: a POST request and a PUT request
- I am able to send a POST request successfully
- However, sending the PUT request returns a Client error (422): Unprocessable entity (RFC 4918)
- Specifically, the error states: "Detected 263 extra values in row 3 than the no. of columns in header. Could be due to improperly quoted values or wrong encoding"
- I checked the CSV format and the CSV intake settings (default) at the API endpoint; all of them match
- I even tried uploading the same CSV file manually via the GUI; it works seamlessly...
Possible clues to a solution:
- I kept ~10 rows with only 3 columns (the original dataset has 200+) and tried uploading. This led to a successful upload, although the subsequent scoring did not take place given the missing columns.
- I feel maybe I am incorrectly specifying my PUT request, so any guidance is much appreciated!
library(httr)

post_req <- POST(batch_api_url,
                 add_headers(.headers = c(Authorization = "Bearer xxxxxx")),
                 body = list(deploymentID = ID, passthroughColumnsSet = "all"),
                 verbose())
# Works...

put_req <- PUT(batch_api_url,
               add_headers(.headers = c(Authorization = "Bearer xxxxxx",
                                        "content-type" = "text/csv; encoding=utf-8",
                                        "content-size" = filesize)),
               body = list(input_file = upload_file(filepath, "text/csv"), to_json = F),
               encode = "multipart",
               verbose())
# 422 error if I use all 200+ columns; but succeeds in uploading if I use a
# reduced dataset (3-4 columns only)
- How can I put editing of local storage on the MongoDB server side?
How can I put the local storage code on everyone's side and show the website?
- How do I update a record using the PUT API?
As you can see in the React.js code, I am trying to reduce the quantity one by one when clicking the Delivered button. When I click the Delivered button, the quantity becomes null.
const reduceQuantity = (productId) => {
    const quantityReduce = parseInt(product.quantity) - 1;
    const url = `http://localhost:5000/product/${productId}`;
    fetch(url, {
        method: "PUT",
        headers: {
            "Content-Type": "application/json",
        },
        body: JSON.stringify({ quantityReduce }),
    })
        .then((res) => res.json())
        .then((data) => {
            console.log('success', data);
        });
};

return (
    <div className='item-container'>
        <div className='text-center'>
            <img src={product.picture} alt="" />
        </div>
        <h6><b>{product.name}</b></h6>
        <p className='text-justify'><b>Description:</b>{product.Description}</p>
        <div className='d-flex'>
            <div>
                <p><b>Price:</b>{product.price}</p>
                <p><b>supplierName:</b>{product.supplierName}</p>
            </div>
            <p><b>Quantity:</b>{product.quantity}</p>
        </div>
        <div className='text-center d-flex justify-content-around'>
            <button className='btn btn-outline-success rounded-pill'>Delivered</button>
        </div>
    </div>
);

Server-side API:

// update quantity of products
app.put("/product/:id", async (req, res) => {
    const id = req.params.id;
    const data = req.body;
    console.log("from update api", data);
    const filter = { _id: ObjectId(id) };
    const options = { upsert: true };
    const updateDoc = {
        $set: {
            quantity: data.addQuantity,
        },
    };
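A hedged observation that may explain the null: the client sends a body of { quantityReduce }, but the server reads data.addQuantity, so $set writes undefined and MongoDB stores null. Using the same key on both sides should fix it; a minimal sketch of the server side:

    const updateDoc = {
        $set: {
            quantity: data.quantityReduce, // match the key the client actually sent
        },
    };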