{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "*This notebook contains material from [cbe67701-uncertainty-quantification](https://ndcbe.github.io/cbe67701-uncertainty-quantification);\n", "content is available [on Github](https://github.com/ndcbe/cbe67701-uncertainty-quantification.git).*" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "\n", "< [7.1 Latin Hypercube and Quasi-Monte Carlo Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.01-Sampling-Based-Uncertainty-Quantification.html) | [Contents](toc.html) | [7.3 Meaningful Title Goes Here](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.03-Contributed-Example.html)
"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "DnCs7XId4TeP",
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"source": [
"# 7.2 Latin Hypercube Sampling\n"
]
},
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "5ZETasp64TeQ",
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"source": [
"Created by V.Vijay Kumar Naidu (vvelagal@nd.edu)"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"source": [
"The text, examples, and codes in this notebook were adapted from the following references:\n",
"* https://en.wikipedia.org/wiki/Latin_hypercube_sampling\n",
"* McClarren, Ryan G (2018). *Uncertainty Quantification and Predictive Computational Science: A Foundation for Physical Scientists and Engineers, Chapter 7 : Sampling-Based Uncertainty Quantification Monte Carlo and Beyond*, Springer, https://link.springer.com/chapter/10.1007%2F978-3-319-99525-0_7"
]
},
{
"cell_type": "code",
"execution_count": 1,
"metadata": {
"colab": {},
"colab_type": "code",
"id": "kQhVsePNSvu8",
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"outputs": [],
"source": [
"# Install Python libraries\n",
"!pip install -q sobol_seq\n",
"!pip install -q ghalton\n",
"!pip install -q pyDOE"
]
},
{
"cell_type": "code",
"execution_count": 2,
"metadata": {
"colab": {},
"colab_type": "code",
"id": "KffZ1fJ_4TeQ",
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"outputs": [],
"source": [
"# Import\n",
"import matplotlib.pyplot as plt\n",
"import numpy as np\n",
"import scipy.sparse as sparse\n",
"import scipy.sparse.linalg as linalg\n",
"import scipy.integrate as integrate\n",
"import math\n",
"from scipy.stats.distributions import norm\n",
"from scipy.stats import gamma\n",
"import sobol_seq\n",
"import ghalton\n",
"from scipy import stats\n",
"from pyDOE import *\n",
"%matplotlib inline"
]
},
{
"cell_type": "code",
"execution_count": 3,
"metadata": {
"nbpages": {
"level": 1,
"link": "[7.2 Latin Hypercube Sampling](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2-Latin-Hypercube-Sampling)",
"section": "7.2 Latin Hypercube Sampling"
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Checking for figures/lhs_custom_distribution.png\n",
"\tFile found!\n",
"Checking for figures/chapter7-screenshot.PNG\n",
"\tFile found!\n"
]
}
],
"source": [
"# Download figures (if needed)\n",
"import os, requests, urllib\n",
"\n",
"# GitHub pages url\n",
"url = \"https://ndcbe.github.io/cbe67701-uncertainty-quantification/\"\n",
"\n",
"# relative file paths to download\n",
"# this is the only line of code you need to change\n",
"file_paths = ['figures/lhs_custom_distribution.png', 'figures/chapter7-screenshot.PNG']\n",
"\n",
"# loop over all files to download\n",
"for file_path in file_paths:\n",
" print(\"Checking for\",file_path)\n",
" # split each file_path into a folder and filename\n",
" stem, filename = os.path.split(file_path)\n",
" \n",
" # check if the folder name is not empty\n",
" if stem:\n",
" # check if the folder exists\n",
" if not os.path.exists(stem):\n",
" print(\"\\tCreating folder\",stem)\n",
" # if the folder does not exist, create it\n",
" os.mkdir(stem)\n",
" # if the file does not exist, create it by downloading from GitHub pages\n",
" if not os.path.isfile(file_path):\n",
" file_url = urllib.parse.urljoin(url,\n",
" urllib.request.pathname2url(file_path))\n",
" print(\"\\tDownloading\",file_url)\n",
" with open(file_path, 'wb') as f:\n",
" f.write(requests.get(file_url).content)\n",
" else:\n",
" print(\"\\tFile found!\")"
]
},
{
"cell_type": "markdown",
"metadata": {
"nbpages": {
"level": 2,
"link": "[7.2.1 Latin Hypercube Basics](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2.1-Latin-Hypercube-Basics)",
"section": "7.2.1 Latin Hypercube Basics"
}
},
"source": [
"## 7.2.1 Latin Hypercube Basics\n",
"\n",
"Latin hypercube sampling (LHS) is a statistical method for generating a near random samples with equal intervals.\n",
"\n",
"To generalize the Latin square to a hypercube, we define a X = (X1, . . . , Xp) as a collection of p independent random variables. To generate N samples, we divide the domain of each Xj in N intervals. In total there are Np such intervals. The intervals are defined by the N + 1 edges:\n"
]
},
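  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "A minimal sketch of this stratification, using the `lhs` function from pyDOE (imported above): it draws $N$ samples of $p$ independent variables on the unit hypercube and then maps one column to a standard normal with the inverse CDF. The values of `N` and `p` below are illustrative choices."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Sketch: an LHS design on the unit hypercube with pyDOE's lhs\n",
    "N = 10   # number of samples (= number of intervals per variable); illustrative choice\n",
    "p = 4    # number of independent variables; illustrative choice\n",
    "\n",
    "X = lhs(p, samples=N)   # N x p array; each column has one point per interval of width 1/N\n",
    "\n",
    "# Map the first variable to a standard normal with the inverse CDF; the\n",
    "# equal-probability strata of [0, 1) become equal-probability strata of the normal\n",
    "x_normal = norm(loc=0, scale=1).ppf(X[:, 0])\n",
    "\n",
    "print(X)\n",
    "print(x_normal)"
   ]
  },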
{
"cell_type": "markdown",
"metadata": {
"colab_type": "text",
"id": "7KFZKceKHWVX",
"nbpages": {
"level": 3,
"link": "[7.2.1.1 Latin Hypercube in 2D](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2.1.1-Latin-Hypercube-in-2D)",
"section": "7.2.1.1 Latin Hypercube in 2D"
}
},
"source": [
"### 7.2.1.1 Latin Hypercube in 2D\n",
"\n",
"Makes a Latin Hyper Cube sample and returns a matrix X of size n by 2. For each column of X, the n values are randomly distributed with one from each interval (0,1/n), (1/n,2/n), ..., (1-1/n,1) and they are randomly permuted."
]
},
{
"cell_type": "code",
"execution_count": 4,
"metadata": {
"colab": {},
"colab_type": "code",
"id": "30R2uAAp6aiR",
"nbpages": {
"level": 3,
"link": "[7.2.1.1 Latin Hypercube in 2D](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2.1.1-Latin-Hypercube-in-2D)",
"section": "7.2.1.1 Latin Hypercube in 2D"
}
},
"outputs": [],
"source": [
"def latin_hypercube_2d_uniform(n):\n",
" lower_limits=np.arange(0,n)/n\n",
" upper_limits=np.arange(1,n+1)/n\n",
"\n",
" points=np.random.uniform(low=lower_limits,high=upper_limits,size=[2,n]).T\n",
" np.random.shuffle(points[:,1])\n",
" return points"
]
},
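  {
   "cell_type": "markdown",
   "metadata": {},
   "source": [
    "As a quick check of the Latin property of `latin_hypercube_2d_uniform`, the sketch below verifies that, in each coordinate, the $n$ points occupy the $n$ intervals $(0, 1/n), \\ldots, (1 - 1/n, 1)$ exactly once; `n = 10` is an illustrative choice."
   ]
  },
  {
   "cell_type": "code",
   "execution_count": null,
   "metadata": {},
   "outputs": [],
   "source": [
    "# Check that each coordinate of the 2D sample hits every interval exactly once\n",
    "n = 10   # illustrative sample size\n",
    "pts = latin_hypercube_2d_uniform(n)\n",
    "for j in range(2):\n",
    "    occupied = np.sort(np.floor(pts[:, j] * n).astype(int))\n",
    "    print('coordinate', j, 'occupies intervals', occupied)"
   ]
  },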
{
"cell_type": "code",
"execution_count": 5,
"metadata": {
"colab": {
"base_uri": "https://localhost:8080/",
"height": 341
},
"colab_type": "code",
"id": "lD6llSHp8nUH",
"nbpages": {
"level": 3,
"link": "[7.2.1.1 Latin Hypercube in 2D](https://ndcbe.github.io/cbe67701-uncertainty-quantification/07.02-Latin-Hypercube-sampling.html#7.2.1.1-Latin-Hypercube-in-2D)",
"section": "7.2.1.1 Latin Hypercube in 2D"
},
"outputId": "a245fe0e-5436-4fb4-f237-8651f745b45d"
},
"outputs": [
{
"data": {
"text/plain": [
"
"
]
}
],
"metadata": {
"colab": {
"name": "07.02-Latin-Hypercube-sampling.ipynb",
"provenance": []
},
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.7.3"
}
},
"nbformat": 4,
"nbformat_minor": 1
}