• support[@]kurinchilion.com
  • +1 (888) 666-4252

Mac Terminal: How to hide the message “the default interactive shell is now zsh”

Dec 01, 2022 - by kurinchilamp // 371 Views
To remove the message "the default interactive shell is now zsh", edit the .bash_profile (more…)
Continue Reading
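The full steps are behind the link; as a minimal sketch, the variable that macOS's bundled bash checks for this notice is BASH_SILENCE_DEPRECATION_WARNING, exported from ~/.bash_profile:

```shell
# Add to ~/.bash_profile: macOS's bundled bash checks this variable
# and skips printing "the default interactive shell is now zsh".
export BASH_SILENCE_DEPRECATION_WARNING=1
```

Open a new Terminal window afterwards and the notice should no longer appear.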

Git sparse checkout or partial checkout

Sep 21, 2022 - by kurinchilamp // 345 Views
There are scenarios when you are contributing to a subset of a very large project. Instead of downloading the entire project, you may want to bring down a copy of ONLY the needed folders or files (to save download time, space ...). Git supports partial cloning, which lets Git function without having a complete copy of the repository. Follow the below steps to try out Git's sparse-checkout feature. (more…)
Continue Reading
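The detailed steps are behind the link; as a runnable sketch against a throwaway local repository (the repo and folder names here are made up for the demo), sparse checkout looks like this:

```shell
set -e
tmp=$(mktemp -d)
cd "$tmp"

# Build a sample repository with two top-level folders.
git init -q --initial-branch=main full
cd full
mkdir docs src
echo readme > docs/readme.txt
echo code > src/main.c
git add .
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial"
cd ..

# Partial clone: skip downloading blobs up front and check out nothing yet.
git clone -q --no-local --filter=blob:none --no-checkout full partial
cd partial

git sparse-checkout init --cone   # restrict the working tree to chosen folders
git sparse-checkout set src       # only src/ will be materialized
git checkout -q main

ls   # shows only src, not docs
```

With a real remote, the `git clone --filter=blob:none --no-checkout <url>` step is where the download savings come from: blobs outside the sparse set are fetched only on demand.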

Linux: Copy files and folders including hidden files to another folder

Jan 26, 2022 - by kurinchilamp /Linux Server/ 286 Views
Simple copy command to copy ALL files and folders, including the hidden files, from a source folder to a destination folder:
$ cp -RTf /home/sourcefolder/. /home/destinationfolder/
Continue Reading
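A quick way to see that the command above really picks up dotfiles is to try it on throwaway folders (the folder and file names here are made up for the demo):

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/sourcefolder" "$tmp/destinationfolder"
echo visible > "$tmp/sourcefolder/file.txt"
echo hidden > "$tmp/sourcefolder/.config"   # a hidden file

# The trailing "/." copies the folder's contents, dotfiles included;
# -R recurses, -T targets the destination directly, -f overwrites.
cp -RTf "$tmp/sourcefolder/." "$tmp/destinationfolder/"

ls -A "$tmp/destinationfolder"   # lists both file.txt and .config
```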

Ionic: Newer version doesn’t show up after global install

Apr 18, 2019 - by kurinchilamp // 305 Views
You can install the latest versions of the Ionic and Cordova CLIs using a single command:
$ npm install ionic cordova -g
Still, you may get a notification message saying "Please install your Cordova CLI to version >=4.2.0 `npm install -g cordova`", or see incorrect version numbers for ionic and cordova even after installing their latest versions globally. (more…)
Continue Reading

Express NodeJS – How to redirect HTTP to HTTPS?

Feb 08, 2019 - by kurinchilamp // 284 Views
In order to redirect HTTP to HTTPS in an Express server, try adding the following lines to your code base (note the return before the redirect, so next() is not called on an already-redirected request):

app.set('trust proxy', 1);
app.use(function(req, res, next) {
  if (!req.secure) {
    return res.redirect("https://" + req.headers.host + req.url);
  }
  return next();
});
Continue Reading

How to prevent search engines from crawling your website?

Apr 14, 2015 - by kurinchilamp /Linux Server/ 306 Views
Your answer is to create a robots.txt file in the root of your web directory and put the setting given below in the file.

User-agent: *
Disallow: /

You can read more about the Robots Exclusion Protocol here.
Continue Reading

TECHNOLOGY DEV STACK

The following are some of the technologies we use to build and maintain solutions for our clients.