where there is no match between the rides and companies tables. The Data Lake will keep no history, i.e., it will be overwritten every time from the source system, which means that the source systems preserve history. In this section, we will rename an existing column of our PySpark DataFrame using the .withColumnRenamed() method. Here, we have given a name to our application by passing a string to .appName() as an argument. Now that you have added the libraries and all three functions to your notebook, including the function that executes writes to a Parquet table, you can run the transformation. In this article, we will perform and understand a basic operation of dropping single and multiple columns from a PySpark data frame. Example 3: convert the data type of the grade column from float to int. Foreign key relationships need to be established. How do you change DataFrame column names in PySpark? Specifically, we are going to explore how to do so using the selectExpr() method, the withColumnRenamed() method, the toDF() method, and alias, together with Spark Session and Spark SQL, renaming one or more columns at a time. This works on columns with one or more aliases as well as on unaliased columns. Files are kept in identical file and folder format. After the Python code execution, the rides Delta table will have its id_company column set to "-1" where the hash keys mismatch. Old files are removed using a Spark SQL-only command called VACUUM. To rename multiple columns, you can chain multiple withColumnRenamed calls. We have also covered three different methods for creating new columns in a data frame conditionally.
First, let's create an example PySpark DataFrame that we'll reference throughout this guide to demonstrate a few concepts. Data should be augmented with surrogate keys. To select one or more columns of a PySpark DataFrame, we will use the .select() method. After this, we passed in a dictionary dataset consisting of information related to different electronic gadgets. Even if we pass the same column twice, the .show() method will display the column twice. You must design your own relationship (foreign key) management in Spark using Python. A PySpark DataFrame is built over Spark's core data structure, the Resilient Distributed Dataset (RDD). On the other hand, Parquet is compressed columnar storage that resembles the characteristics of a clustered columnstore index in a traditional data warehouse. Let us continue with the same updated DataFrame from the last step, with the renamed column of weights of fishes in kilograms. Figure 4: Result of Successful Python Returns. We will transform dim_company and generate dim_date using either the Python or the SQL code examples. As an example, we will use our taxi rides and company tables and perform an aggregation. In this tip, I will be hashing the business key columns and then looking them up against other tables. The following method can also rename the columns of multiple files (reference: https://www.linkedin.com/pulse/pyspark-methods-rename-columns-kyle-gibson/). Here we discuss the introduction, the working of alias in PySpark, and examples for better understanding.
The alias function can be used as a substitute for the column or table in PySpark and can be further used to access all its properties. This runs on modern data platforms such as Azure Synapse Analytics or AWS Glue. If I did not miss anything, this seems to fit all scenarios in which one would want to use aliases, unless you are trying to alias some columns to allow something like column duck-typing. We apply this architecture to the design of Data Lakes and to specific customer cases. If you get an error importing the quinn library, install it first. In this article, we performed a basic operation of dropping single and multiple columns from a PySpark data frame. Here is the function for reading all Parquet files; it uses the following Python libraries. See my answer for a solution that can programmatically rename columns. The function just gives a new name as the reference that can be used further for the data frame in PySpark. Figure 1: Medallion Architecture with 4 Layers. Below are a few approaches to do the same for the relationship between rides and companies. Now let's try to rename col_1 to col_3. Are you looking for how to rename the column of a PySpark DataFrame in Azure Databricks, that is, to change an existing DataFrame column name into a new one using the withColumnRenamed() method? Here, we used the * operator to drop the Device price and Warranty columns from the data frame. Method 1: using the DataFrame.astype() method.
To create an alias of a column, we will use the .alias() method. A SparkSession acts as an entry point to access the Spark APIs. In addition to the three layers, a fourth area, the landing zone, is needed; I will be using it for the taxi data. Delta Lake files maintenance is done by VACUUM. We create a dictionary and specify the column name with the desired data type. In pandas this would be df.rename(columns={"by_week": "Last 7 days", "by_month": "Last 30 days"}); is there a better and more efficient way to do this in PySpark, like we do in pandas? We can think of an alias as a derived name for a table or column in a PySpark data frame or data set. You can also sort with sort("department", "state"). This rules out column names containing spaces or special characters and column names that start with an integer; PySpark doesn't let us access such a name directly, so we have to parse it out from the string representation. In short, the medallion architecture requires splitting the Data Lake into three main areas. You can also use the select function along with the alias function to rename columns while selecting them. A Resilient Distributed Dataset is a low-level object that allows Spark to work by dividing data across multiple cluster nodes. Bring only the fields you intend to use over to the silver layer to avoid the complexity run-away that may result otherwise. Let us continue with the same updated DataFrame from the last step, with an additional column of weights of fishes in kilograms. The aliasing function can also be used to change a column name in the existing data frame.
The PySpark data frame also consists of rows and columns, but the processing part is different, as it uses in-memory (RAM) computational techniques for processing the data. In this blog post, we have explored various techniques for renaming columns in PySpark DataFrames. The companies table carries id_company, and a second table called rides includes a reference to the company. You can also use select(); one disadvantage of using this is that you need to provide all the column names you want on the resulting DataFrame. Depending on the flexibility you need or the complexity of the task, you can choose the approach that fits and look the keys up against other tables. The entire data under a dropped column will be removed from the data frame and the processing unit. In that case, you won't want to run maintenance manually. Next, we used .getOrCreate(), which will create and instantiate a SparkSession into our object spark. Row storage supports row-based access but does not offer the best compression. Aliases are just like a temporary name. There are examples of implementing Databricks solutions in this tip. The example below renames a column name to sum_salary. For Python 3, replace xrange with range. The first argument in withColumnRenamed is the old column name. Given below is the working of alias in PySpark; let's start by creating simple data. The alias gives a new name for a certain column or table, and its properties can be used through it. Here, we used the .withColumn() method. Syntax: DataFrame/Series.apply(func, convert_dtype=True, args=()). Now, we change the DataFrame data types to string.
If hashing fails to return a result, the key value will be set to "-2". Use the example code below. The companies table contains the taxi company name and the corresponding SHA2 hash key in a column. The alias was issued to change the name of the column ID to a new name, New_Id. The rides table holds a reference to the company plus additional information about every ride, like fare, date-time, and more (Figure 8: One-to-Many Relationship between Rides and Companies). Return: DataFrame/Series after the applied function/operation. Then we will create a new DataFrame by importing a CSV file. The bronze layer preserves the origins of the data (Figure 6: Weather Data Transformation, Bronze Layer). The example below renames a column name to sum_salary. Furthermore, Python has a style guide, PEP 8, to use as a base. We will use the same function to execute this operation. The alias function can be used in certain joins where there is a self-join condition or where we deal with more tables or columns in a data frame. Another good approach is to use the PySpark DataFrame withColumnRenamed() operation to alias/rename a column of a groupBy() result.
table: Finally, your file and folder transformations for the weather data yield the following structure. Old files are deleted once the retention period called deletedFileRetentionDuration is overdue. You can create a multi-dimensional cube for the current DataFrame using the specified columns, so we can run aggregations on them. Spark is an excellent tool for a modern Data Lake; it can perform the same operations in various ways, which is almost the opposite of SQL. We can pass pandas.to_numeric, pandas.to_datetime, and pandas.to_timedelta as arguments to the apply() function to change the data type of one or more columns to numeric, datetime, and timedelta, respectively. In addition, business-critical logic is to be implemented in the silver layer, adhering to the following data design principles. If you need to rename a specific column, you can use the df.rename() function and refer to the columns to be renamed. I will use the following Python libraries for the silver layer transformation, and I will reuse the read_files() function from the bronze layer transformation. For example, consider the following Spark SQL example that uses an alias to rename DataFrame column names. The bronze layer will contain raw copies of the data "as-is" from the source systems. The select method returns a PySpark DataFrame, thus we have used the .show() method at the end to display it. Whether to use data.withColumnRenamed(oldColumns[idx], newColumns[idx]) in a loop or data.withColumnRenamed(columnname, new_columnname) directly depends on which version of PySpark you are using. The landing zone container is called "0-landingzone" (Figure 2: Landing Zone Folder Structure for Taxi Data). This solution is the closest to df.columns = new_column_name_list per the OP, both in how concise it is and in its execution.
The more fields you bring over from bronze to silver, the harder the silver layer is to maintain. While operating with a join, aliasing can be used to join based on a table column. This is the least flexible option. We can pass any Python, NumPy, or pandas data type to change all columns of a DataFrame to that type, or we can pass a dictionary having column names as keys and data types as values to change the type of selected columns. Updated: 2023-06-02. I took the Yellow and Green taxi trips and uploaded them to my Azure Storage Account blob container. Most of the methods work similarly to those of a pandas DataFrame and have method names similar to those used in SQL. In this blog post, we will provide a comprehensive overview of different techniques to alias columns in PySpark DataFrames, from simple renaming operations to more advanced techniques like renaming columns based on conditions or using SQL expressions. My Spark version is 1.5.0. The PySpark DataFrame class provides a sort() function to sort on one or more columns. Check out my other articles here and on Medium. Now, we convert the data type of the grade column from float to int.
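A short pandas sketch of both conversion styles (the grade/age columns are illustrative):

```python
import pandas as pd

df = pd.DataFrame({"grade": [89.6, 72.3, 91.0], "age": ["21", "22", "23"]})

# A dictionary maps column names to the desired data types,
# so only the listed columns are converted (floats truncate toward zero).
df = df.astype({"grade": "int64"})

# pandas.to_numeric via apply() converts one or more columns at once.
df[["age"]] = df[["age"]].apply(pd.to_numeric)
```

Passing a single dtype instead of a dictionary would convert every column of the frame.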
It allows source system abstraction using aliases, which is very useful when joining tables with duplicate column names. This tip provides an example of a data lake architecture designed for sub-100GB workloads; we take the yellow and green taxi data and bring it over to the bronze layer. Once the code has been executed, you should see the following output: our DataFrame's column names are now all in lower case. In this post, we will focus on another important operation: column aliasing. Run the print schema command on the weather DataFrame to check that the bronze layer conforms with the expected folder and file structure; the same data also serves the database for Power Apps. The Data Lake's main areas are Bronze, Silver, and Gold. Generally, we require a schema for generating the data frame, but the dataset alone can also structure the PySpark data frame. I would like to rename sum(salary) to sum_salary. This command appears to change only the specified column while maintaining all other columns. If you want to do some computation and rename the new values, we can do this by using alias after groupBy(). I would choose the alias function to cover it over the data frame. One such API is known as PySpark, which was developed for the Python environment. Here, the Device name and Store name columns are removed from the data frame with the help of a list comprehension. You can only reference columns that are valid to be accessed using the dot operator. In this section, we will see how to select columns in a PySpark DataFrame. The generated pandas data frame serves as the reference data structure for the PySpark data frame.
The silver layer resembles what the source systems provide. Just as the column name is used to access a particular column of a table, the alias name, as in A.columnname, can be used for the same purpose in a PySpark SQL function. You can also sort on multiple Column objects: sort(col("department"), col("state")).
Use the sum() SQL function to perform a summary aggregation, which returns a Column type, and use alias() of the Column type to rename the resulting DataFrame column. You can find more details on the medallion architecture in this tip. We used the select() method to keep all the columns except Device name and Store name. We can get the names of the columns as a list from a pandas DataFrame using its columns attribute (Figure 5: Bronze Layer File Transformation). From simple renaming operations to more advanced techniques like renaming columns based on conditions or using SQL expressions, these techniques enable you to give meaningful names to your columns and standardize column names across datasets easily and efficiently. To rename a single column in a DataFrame, you can use the withColumnRenamed function. If you want to change all column names, try df.toDF(*cols); in case you would like to apply a simple transformation on all column names, replacing all spaces with underscores does the trick. The axis labels are collectively called the index. As I previously mentioned, Delta tables require additional maintenance. PySpark column operations play a key role in manipulating and displaying the desired results of a PySpark DataFrame. Source systems deliver data to the landing zone using their own file formats and file hierarchy. While this code snippet may solve the question, say you have 200 columns and you'd like to rename 50 of them that have a certain type of column name and leave the other 150 unchanged. With the knowledge you've gained from this post, you're now well-equipped to handle column renaming tasks in PySpark DataFrames.
To manage and run PySpark notebooks, you can employ one of the two popular modern platforms, and by using Python you open up a whole new world of libraries to use for your Data Lake project. It is important to know these operations, as one may always require any or all of them while performing any PySpark exercise. For alias, in Scala you can also use as. The deletion itself is performed by the VACUUM command. This code generates a simple physical plan that's easy for Catalyst to optimize. Let's suppose I want a new column with the weight of fishes in kilograms. You can integrate the Data Lake with other sources like Dynamics 365 and Microsoft Dataverse. In the end, the resulting DataFrame is exactly the same. This is an easy way to rename multiple columns with a loop; it is the closest statement to df.columns = new_column_name_list, doesn't require any rarely used functions, and emphasizes some patterns that are very helpful in Spark.