Denver Redlining
Introduction
Redlining is a discriminatory practice in which financial services are systematically withheld from neighborhoods with significant racial and ethnic minority populations. The practice has been most prominent in the United States, predominantly targeting African-American communities. Common examples include the denial of credit, insurance, and healthcare services, as well as the creation of food deserts in minority neighborhoods.
Redlining in the U.S. emerged within a broader context of racial segregation and discrimination, shaped by economic theories on race and property values promoted by Richard T. Ely’s Institute for Research in Land Economics. The National Housing Act of 1934 and the establishment of the Federal Housing Administration (FHA) institutionalized this practice, under the leadership of Homer Hoyt, who developed the first mortgage underwriting criteria. This federal policy accelerated the decline of minority neighborhoods by restricting access to mortgage capital, exacerbating residential segregation and contributing to urban decay.
Redlining in Denver, which began in the 1930s, was a discriminatory practice that systematically excluded minority communities, particularly African Americans, from access to financial services and housing opportunities. The Federal Housing Administration (FHA) created maps that marked neighborhoods deemed "high-risk" in red, discouraging banks from issuing mortgage loans in these areas.
One notable example is the Five Points neighborhood, which, despite the restrictions imposed by redlining, emerged as a vibrant cultural hub known as the "Harlem of the West." From the 1920s to the 1960s, Five Points thrived with a rich jazz scene and an active African American business community.
Although redlining was officially banned in 1968, its effects persisted, contributing to residential segregation and a lack of investment in certain neighborhoods. In recent years, Denver has implemented programs to address these historical inequalities. For example, in 2022, the city launched an initiative offering financial assistance for down payments to individuals affected by redlining between 1938 and 2000, aiming to promote equity in homeownership.
Additionally, studies have shown that areas historically impacted by redlining in Denver continue to face significant challenges in terms of health equity and economic opportunities, reflecting the long-term consequences of these discriminatory practices.
In summary, redlining in Denver had a profound and lasting impact on the city’s development, perpetuating segregation and limiting opportunities for minority communities. Current efforts seek to acknowledge and remedy these historical injustices to build a more inclusive and equitable city.
Data Citation
- "Redlining: How the Five Points Neighborhood Formed Amid Racist Practices in the 1930s"
- "Redlined in Denver Between 1938 and 2000? The City Might Help You Put a Down Payment on a New Home"
- "Denver’s Public Health and Environment Equity Report (2020)"
These sources detail the historical impact of redlining in Denver, the specific case of Five Points, and current efforts to address the effects of this discriminatory practice.
Description of the Resources
We will import a multispectral image from the HLSL30 product and work with its green band.
Multispectral image: A set of images taken in different bands of the electromagnetic spectrum (e.g., visible bands such as red, green, and blue, plus others in the near infrared). The sensor has a revisit time of 8 days. This product provides images with a spatial resolution of 30 meters, adjusted to correct for bidirectional reflectance distribution function (BRDF) effects, allowing global observations of the Earth's surface. We will use the product's surface reflectance values.
HLSL30 product: The Landsat-based surface reflectance product from NASA's Harmonized Landsat and Sentinel-2 (HLS) project, distributed at 30 m resolution.
Green band: The spectral channel corresponding to wavelengths associated with the color green (0.53 – 0.59 µm), often used in remote sensing to assess vegetation, photosynthesis, and crop health.
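As an illustration of how the green band supports vegetation assessment, the sketch below computes the Green NDVI, GNDVI = (NIR − Green) / (NIR + Green), on toy reflectance arrays. The values are made up for demonstration and do not come from the HLSL30 data used later.

```python
import numpy as np

# Hypothetical surface-reflectance values, scaled 0-1
green = np.array([0.08, 0.10, 0.12])
nir = np.array([0.40, 0.35, 0.12])

# Green NDVI: higher values suggest denser, healthier vegetation
gndvi = (nir - green) / (nir + green)
print(gndvi)
```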
Import Libraries
%store -r denver_redlining_gdf data_dir
import re # Use regular expressions to extract metadata
import earthaccess # Access NASA data from the cloud
import matplotlib.pyplot as plt # Overlay raster and vector data
import numpy as np # Process bit-wise cloud mask
import pandas as pd # Group and aggregate
import rioxarray as rxr # Work with raster data
from rioxarray.merge import merge_arrays # Merge rasters
# Search earthaccess
earthaccess.login(strategy="interactive", persist=True)
denver_results = earthaccess.search_data(
    short_name="HLSL30",
    bounding_box=tuple(denver_redlining_gdf.total_bounds),
    # A single day as (start, end); a bare string is not a valid date range
    temporal=("2023-07-12", "2023-07-12"),
    count=1
)
denver_files = earthaccess.open(denver_results)
def process_image(uri, bounds_gdf):
    """
    Load, crop, and scale a raster image from earthaccess

    Parameters
    ----------
    uri: file-like or path-like
        File accessor downloaded or obtained from earthaccess
    bounds_gdf: gpd.GeoDataFrame
        Area of interest to crop to

    Returns
    -------
    cropped_da: rxr.DataArray
        Processed raster
    """
    # Connect to the raster image
    da = rxr.open_rasterio(uri, mask_and_scale=True).squeeze()
    # Get the study bounds in the raster's CRS
    bounds = (
        bounds_gdf
        .to_crs(da.rio.crs)
        .total_bounds
    )
    # Crop to the study area
    cropped_da = da.rio.clip_box(*bounds)
    return cropped_da
process_image(denver_files[8], denver_redlining_gdf).plot()
def process_cloud_mask(cloud_uri, bounds_gdf, bits_to_mask):
    """
    Load an 8-bit Fmask file and process to a boolean mask

    Parameters
    ----------
    cloud_uri: file-like or path-like
        Fmask file accessor downloaded or obtained from earthaccess
    bounds_gdf: gpd.GeoDataFrame
        Area of interest to crop to
    bits_to_mask: list of int
        The indices of the bits to mask if set

    Returns
    -------
    cloud_mask: np.array
        Cloud mask
    """
    # Open the Fmask file
    fmask_da = process_image(cloud_uri, bounds_gdf)
    # Unpack the cloud mask bits
    cloud_bits = (
        np.unpackbits(
            (
                # Get the cloud mask as an array...
                fmask_da.values
                # ... of 8-bit integers
                .astype('uint8')
                # With an extra axis to unpack the bits into
                [:, :, np.newaxis]
            ),
            # List the least significant bit first to match the user guide
            bitorder='little',
            # Expand the array in a new dimension
            axis=-1)
    )
    cloud_mask = np.sum(
        # Select the bits to mask
        cloud_bits[:, :, bits_to_mask],
        # Sum along the bit axis
        axis=-1
    # True where none of the selected bits are set
    ) == 0
    return cloud_mask
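To make the bitwise logic concrete, here is a minimal, self-contained sketch of the same unpacking on a tiny made-up Fmask array. The example values and the bit meanings in the comments (bit 1 cloud, bit 3 cloud shadow, bit 5 water) are illustrative assumptions; consult the HLS user guide for the authoritative bit definitions.

```python
import numpy as np

# Tiny hypothetical Fmask values: 0 = all bits clear, 2 = bit 1 set (cloud),
# 8 = bit 3 set (cloud shadow), 32 = bit 5 set (water)
fmask = np.array([[0, 2], [8, 32]], dtype='uint8')

# Unpack each 8-bit value into its individual bits, least significant first
bits = np.unpackbits(fmask[:, :, np.newaxis], bitorder='little', axis=-1)

# Keep pixels where none of the selected bits are set
bits_to_mask = [1, 2, 3, 5]
clear = np.sum(bits[:, :, bits_to_mask], axis=-1) == 0
print(clear)
```

Only the pixel with value 0 survives the mask; every other pixel has at least one of the selected bits set.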
blue_da = process_image(denver_files[1], denver_redlining_gdf)
denver_cloud_mask = process_cloud_mask(
denver_files[-1],
denver_redlining_gdf,
[1, 2, 3, 5])
#blue_da.where(denver_cloud_mask).plot()
# Search earthaccess
earthaccess.login(strategy="interactive", persist=True)
denver_results = earthaccess.search_data(
    short_name="HLSL30",
    bounding_box=tuple(denver_redlining_gdf.total_bounds),
    # A single day as (start, end); a bare string is not a valid date range
    temporal=("2023-07-12", "2023-07-12"),
)
# Open earthaccess results
denver_files = earthaccess.open(denver_results)
denver_files[8].full_name
'https://data.lpdaac.earthdatacloud.nasa.gov/lp-prod-protected/HLSL30.020/HLS.L30.T13SDD.2023193T173653.v2.0/HLS.L30.T13SDD.2023193T173653.v2.0.B03.tif'
# Compile a regular expression to search for metadata
uri_re = re.compile(
r"HLS\.L30\.(?P<tile_id>T[0-9A-Z]+)\.(?P<date>\d+)T\d+\.v2\.0\."
r"(?P<band_id>.+)\.tif"
)
# Find all the metadata in the file name
uri_groups = [
uri_re.search(denver_file.full_name).groupdict()
for denver_file in denver_files]
# Create a DataFrame with the metadata
raster_df = pd.DataFrame(uri_groups)
# Add the File-like URI to the DataFrame
raster_df['file'] = denver_files
# Check the results
#raster_df
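On a couple of granule names, this metadata-extraction pattern produces a table like the one below. The B03 name is the real granule shown earlier; the Fmask name is a plausible sibling constructed for illustration.

```python
import re
import pandas as pd

uri_re = re.compile(
    r"HLS\.L30\.(?P<tile_id>T[0-9A-Z]+)\.(?P<date>\d+)T\d+\.v2\.0\."
    r"(?P<band_id>.+)\.tif"
)

# Example granule names for two bands of one tile
names = [
    'HLS.L30.T13SDD.2023193T173653.v2.0.B03.tif',
    'HLS.L30.T13SDD.2023193T173653.v2.0.Fmask.tif',
]
raster_df = pd.DataFrame([uri_re.search(n).groupdict() for n in names])
print(raster_df)
```

Grouping this table by `tile_id` and `band_id`, as in the processing loop below, then pairs each band with its tile's cloud mask.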
# Labels for each band to process (HLSL30 OLI band assignments)
bands = {
    'B02': 'blue',
    'B03': 'green',
    'B04': 'red',
    'B05': 'nir'
}
# Initialize structure for saving images
denver_das = {band_name: [] for band_name in bands.values()}
for tile_id, tile_df in raster_df.groupby('tile_id'):
    # Load the cloud mask for this tile
    fmask_file = tile_df[tile_df.band_id=='Fmask'].file.values[0]
    cloud_mask = process_cloud_mask(
        fmask_file,
        denver_redlining_gdf,
        [1, 2, 3, 5])
    for band_id, band_df in tile_df.groupby('band_id'):
        if band_id in bands:
            band_name = bands[band_id]
            print(band_id, band_name)
            # Process band
            band_da = process_image(band_df.file.values[0], denver_redlining_gdf)
            # Mask band
            band_masked_da = band_da.where(cloud_mask)
            # Store the resulting DataArray for later
            denver_das[band_name].append(band_masked_da)
B02 blue
B03 green
B04 red
B05 nir
# Merge all tiles
denver_merged_das = {
band_name: merge_arrays(das)
for band_name, das
in denver_das.items()}
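merge_arrays mosaics the per-tile rasters into one array covering the whole study area. Conceptually it resembles this toy numpy sketch, in which overlapping tiles are combined by taking the first valid (non-NaN) value at each pixel; this is a simplification that ignores the CRS handling and coordinate alignment rioxarray performs.

```python
import numpy as np

# Two hypothetical tiles covering overlapping columns of a 2x3 grid,
# with NaN marking pixels outside each tile's footprint
tile_a = np.array([[1.0, 2.0, np.nan],
                   [4.0, 5.0, np.nan]])
tile_b = np.array([[np.nan, 2.0, 3.0],
                   [np.nan, 5.0, 6.0]])

# Take the first valid value at each pixel, like a simple mosaic
merged = np.where(np.isnan(tile_a), tile_b, tile_a)
print(merged)
```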
# Plot a merged raster band
denver_merged_das['green'].plot(cmap='Greens', robust=True)