diff --git a/HunYuanDiT/LICENSE-HYDiT b/HunYuanDiT/LICENSE-HYDiT
deleted file mode 100644
index 61ea65d..0000000
--- a/HunYuanDiT/LICENSE-HYDiT
+++ /dev/null
@@ -1,74 +0,0 @@
-TENCENT HUNYUAN COMMUNITY LICENSE AGREEMENT
-Tencent Hunyuan Release Date: 2024/5/14
-By clicking to agree or by using, reproducing, modifying, distributing, performing or displaying any portion or element of the Tencent Hunyuan Works, including via any Hosted Service, You will be deemed to have recognized and accepted the content of this Agreement, which is effective immediately.
-1. DEFINITIONS.
-a. “Acceptable Use Policy” shall mean the policy made available by Tencent as set forth in the Exhibit A.
-b. “Agreement” shall mean the terms and conditions for use, reproduction, distribution, modification, performance and displaying of the Hunyuan Works or any portion or element thereof set forth herein.
-c. “Documentation” shall mean the specifications, manuals and documentation for Tencent Hunyuan made publicly available by Tencent.
-d. “Hosted Service” shall mean a hosted service offered via an application programming interface (API), web access, or any other electronic or remote means.
-e. “Licensee,” “You” or “Your” shall mean a natural person or legal entity exercising the rights granted by this Agreement and/or using the Tencent Hunyuan Works for any purpose and in any field of use.
-f. “Materials” shall mean, collectively, Tencent’s proprietary Tencent Hunyuan and Documentation (and any portion thereof) as made available by Tencent under this Agreement.
-g. “Model Derivatives” shall mean all: (i) modifications to Tencent Hunyuan or any Model Derivative of Tencent Hunyuan; (ii) works based on Tencent Hunyuan or any Model Derivative of Tencent Hunyuan; or (iii) any other machine learning model which is created by transfer of patterns of the weights, parameters, operations, or Output of Tencent Hunyuan or any Model Derivative of Tencent Hunyuan, to that model in order to cause that model to perform similarly to Tencent Hunyuan or a Model Derivative of Tencent Hunyuan, including distillation methods, methods that use intermediate data representations, or methods based on the generation of synthetic data Outputs by Tencent Hunyuan or a Model Derivative of Tencent Hunyuan for training that model. For clarity, Outputs by themselves are not deemed Model Derivatives.
-h. “Output” shall mean the information and/or content output of Tencent Hunyuan or a Model Derivative that results from operating or otherwise using Tencent Hunyuan or a Model Derivative, including via a Hosted Service.
-i. “Tencent,” “We” or “Us” shall mean THL A29 Limited.
-j. “Tencent Hunyuan” shall mean the large language models, image/video/audio/3D generation models, and multimodal large language models and their software and algorithms, including trained model weights, parameters (including optimizer states), machine-learning model code, inference-enabling code, training-enabling code, fine-tuning enabling code and other elements of the foregoing made publicly available by Us at https://huggingface.co/Tencent-Hunyuan/HunyuanDiT and https://github.com/Tencent/HunyuanDiT .
-k. “Tencent Hunyuan Works” shall mean: (i) the Materials; (ii) Model Derivatives; and (iii) all derivative works thereof.
-l. “Third Party” or “Third Parties” shall mean individuals or legal entities that are not under common control with Us or You.
-m. “including” shall mean including but not limited to.
-2. GRANT OF RIGHTS.
-We grant You a non-exclusive, worldwide, non-transferable and royalty-free limited license under Tencent’s intellectual property or other rights owned by Us embodied in or utilized by the Materials to use, reproduce, distribute, create derivative works of (including Model Derivatives), and make modifications to the Materials, only in accordance with the terms of this Agreement and the Acceptable Use Policy, and You must not violate (or encourage or permit anyone else to violate) any term of this Agreement or the Acceptable Use Policy.
-3. DISTRIBUTION.
-You may, subject to Your compliance with this Agreement, distribute or make available to Third Parties the Tencent Hunyuan Works, provided that You meet all of the following conditions:
-a. You must provide all such Third Party recipients of the Tencent Hunyuan Works or products or services using them a copy of this Agreement;
-b. You must cause any modified files to carry prominent notices stating that You changed the files;
-c. You are encouraged to: (i) publish at least one technology introduction blogpost or one public statement expressing Your experience of using the Tencent Hunyuan Works; and (ii) mark the products or services developed by using the Tencent Hunyuan Works to indicate that the product/service is “Powered by Tencent Hunyuan”; and
-d. All distributions to Third Parties (other than through a Hosted Service) must be accompanied by a “Notice” text file that contains the following notice: “Tencent Hunyuan is licensed under the Tencent Hunyuan Community License Agreement, Copyright © 2024 Tencent. All Rights Reserved. The trademark rights of “Tencent Hunyuan” are owned by Tencent or its affiliate.”
-You may add Your own copyright statement to Your modifications and, except as set forth in this Section and in Section 5, may provide additional or different license terms and conditions for use, reproduction, or distribution of Your modifications, or for any such Model Derivatives as a whole, provided Your use, reproduction, modification, distribution, performance and display of the work otherwise complies with the terms and conditions of this Agreement. If You receive Tencent Hunyuan Works from a Licensee as part of an integrated end user product, then this Section 3 of this Agreement will not apply to You.
-4. ADDITIONAL COMMERCIAL TERMS.
-If, on the Tencent Hunyuan version release date, the monthly active users of all products or services made available by or for Licensee is greater than 100 million monthly active users in the preceding calendar month, You must request a license from Tencent, which Tencent may grant to You in its sole discretion, and You are not authorized to exercise any of the rights under this Agreement unless or until Tencent otherwise expressly grants You such rights.
-5. RULES OF USE.
-a. Your use of the Tencent Hunyuan Works must comply with applicable laws and regulations (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the Tencent Hunyuan Works, which is hereby incorporated by reference into this Agreement. You must include the use restrictions referenced in these Sections 5(a) and 5(b) as an enforceable provision in any agreement (e.g., license agreement, terms of use, etc.) governing the use and/or distribution of Tencent Hunyuan Works and You must provide notice to subsequent users to whom You distribute that Tencent Hunyuan Works are subject to the use restrictions in these Sections 5(a) and 5(b).
-b. You must not use the Tencent Hunyuan Works or any Output or results of the Tencent Hunyuan Works to improve any other large language model (other than Tencent Hunyuan or Model Derivatives thereof).
-6. INTELLECTUAL PROPERTY.
-a. Subject to Tencent’s ownership of Tencent Hunyuan Works made by or for Tencent and intellectual property rights therein, conditioned upon Your compliance with the terms and conditions of this Agreement, as between You and Tencent, You will be the owner of any derivative works and modifications of the Materials and any Model Derivatives that are made by or for You.
-b. No trademark licenses are granted under this Agreement, and in connection with the Tencent Hunyuan Works, Licensee may not use any name or mark owned by or associated with Tencent or any of its affiliates, except as required for reasonable and customary use in describing and distributing the Tencent Hunyuan Works. Tencent hereby grants You a license to use “Tencent Hunyuan” (the “Mark”) solely as required to comply with the provisions of Section 3(c), provided that You comply with any applicable laws related to trademark protection. All goodwill arising out of Your use of the Mark will inure to the benefit of Tencent.
-c. If You commence a lawsuit or other proceedings (including a cross-claim or counterclaim in a lawsuit) against Us or any person or entity alleging that the Materials or any Output, or any portion of any of the foregoing, infringe any intellectual property or other right owned or licensable by You, then all licenses granted to You under this Agreement shall terminate as of the date such lawsuit or other proceeding is filed. You will defend, indemnify and hold harmless Us from and against any claim by any Third Party arising out of or related to Your or the Third Party’s use or distribution of the Tencent Hunyuan Works.
-d. Tencent claims no rights in Outputs You generate. You and Your users are solely responsible for Outputs and their subsequent uses.
-7. DISCLAIMERS OF WARRANTY AND LIMITATIONS OF LIABILITY.
-a. We are not obligated to support, update, provide training for, or develop any further version of the Tencent Hunyuan Works or to grant any license thereto.
-b. UNLESS AND ONLY TO THE EXTENT REQUIRED BY APPLICABLE LAW, THE TENCENT HUNYUAN WORKS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED “AS IS” WITHOUT ANY EXPRESS OR IMPLIED WARRANTIES OF ANY KIND INCLUDING ANY WARRANTIES OF TITLE, MERCHANTABILITY, NONINFRINGEMENT, COURSE OF DEALING, USAGE OF TRADE, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING, REPRODUCING, MODIFYING, PERFORMING, DISPLAYING OR DISTRIBUTING ANY OF THE TENCENT HUNYUAN WORKS OR OUTPUTS AND ASSUME ANY AND ALL RISKS ASSOCIATED WITH YOUR OR A THIRD PARTY’S USE OR DISTRIBUTION OF ANY OF THE TENCENT HUNYUAN WORKS OR OUTPUTS AND YOUR EXERCISE OF RIGHTS AND PERMISSIONS UNDER THIS AGREEMENT.
-c. TO THE FULLEST EXTENT PERMITTED BY APPLICABLE LAW, IN NO EVENT SHALL TENCENT OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, FOR ANY DAMAGES, INCLUDING ANY DIRECT, INDIRECT, SPECIAL, INCIDENTAL, EXEMPLARY, CONSEQUENTIAL OR PUNITIVE DAMAGES, OR LOST PROFITS OF ANY KIND ARISING FROM THIS AGREEMENT OR RELATED TO ANY OF THE TENCENT HUNYUAN WORKS OR OUTPUTS, EVEN IF TENCENT OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.
-8. SURVIVAL AND TERMINATION.
-a. The term of this Agreement shall commence upon Your acceptance of this Agreement or access to the Materials and will continue in full force and effect until terminated in accordance with the terms and conditions herein.
-b. We may terminate this Agreement if You breach any of the terms or conditions of this Agreement. Upon termination of this Agreement, You must promptly delete and cease use of the Tencent Hunyuan Works. Sections 6(a), 6(c), 7 and 9 shall survive the termination of this Agreement.
-9. GOVERNING LAW AND JURISDICTION.
-a. This Agreement and any dispute arising out of or relating to it will be governed by the laws of the Hong Kong Special Administrative Region of the People’s Republic of China, without regard to conflict of law principles, and the UN Convention on Contracts for the International Sale of Goods does not apply to this Agreement.
-b. Exclusive jurisdiction and venue for any dispute arising out of or relating to this Agreement will be a court of competent jurisdiction in the Hong Kong Special Administrative Region of the People’s Republic of China, and Tencent and Licensee consent to the exclusive jurisdiction of such court with respect to any such dispute.
- 
-
-EXHIBIT A
-ACCEPTABLE USE POLICY
-
-Tencent reserves the right to update this Acceptable Use Policy from time to time.
-Last modified: 2024/5/14
-
-Tencent endeavors to promote safe and fair use of its tools and features, including Tencent Hunyuan. You agree not to use Tencent Hunyuan or Model Derivatives:
-1. In any way that violates any applicable national, federal, state, local, international or any other law or regulation;
-2. To harm Yourself or others;
-3. To repurpose or distribute output from Tencent Hunyuan or any Model Derivatives to harm Yourself or others;
-4. To override or circumvent the safety guardrails and safeguards We have put in place;
-5. For the purpose of exploiting, harming or attempting to exploit or harm minors in any way;
-6. To generate or disseminate verifiably false information and/or content with the purpose of harming others or influencing elections;
-7. To generate or facilitate false online engagement, including fake reviews and other means of fake online engagement;
-8. To intentionally defame, disparage or otherwise harass others;
-9. To generate and/or disseminate malware (including ransomware) or any other content to be used for the purpose of harming electronic systems;
-10. To generate or disseminate personal identifiable information with the purpose of harming others;
-11. To generate or disseminate information (including images, code, posts, articles), and place the information in any public context (including –through the use of bot generated tweets), without expressly and conspicuously identifying that the information and/or content is machine generated;
-12. To impersonate another individual without consent, authorization, or legal right;
-13. To make high-stakes automated decisions in domains that affect an individual’s safety, rights or wellbeing (e.g., law enforcement, migration, medicine/health, management of critical infrastructure, safety components of products, essential services, credit, employment, housing, education, social scoring, or insurance);
-14. In a manner that violates or disrespects the social ethics and moral standards of other countries or regions;
-15. To perform, facilitate, threaten, incite, plan, promote or encourage violent extremism or terrorism;
-16. For any use intended to discriminate against or harm individuals or groups based on protected characteristics or categories, online or offline social behavior or known or predicted personal or personality characteristics;
-17. To intentionally exploit any of the vulnerabilities of a specific group of persons based on their age, social, physical or mental characteristics, in order to materially distort the behavior of a person pertaining to that group in a manner that causes or is likely to cause that person or another person physical or psychological harm;
-18. For military purposes;
-19. To engage in the unauthorized or unlicensed practice of any profession including, but not limited to, financial, legal, medical/health, or other professional practices.
diff --git a/HunYuanDiT/conf.py b/HunYuanDiT/conf.py
deleted file mode 100644
index 4322753..0000000
--- a/HunYuanDiT/conf.py
+++ /dev/null
@@ -1,61 +0,0 @@
-"""
-List of all HYDiT model types / settings
-"""
-from argparse import Namespace
-hydit_args = Namespace(**{ # normally from argparse
-    "infer_mode": "torch",
-    "norm": "layer",
-    "learn_sigma": True,
-    "text_states_dim": 1024,
-    "text_states_dim_t5": 2048,
-    "text_len": 77,
-    "text_len_t5": 256,
-})
-
-hydit_conf = {
-    "G/2": { # Seems to be the main one
-        "unet_config": {
-            "depth"       : 40,
-            "num_heads"   : 16,
-            "patch_size"  : 2,
-            "hidden_size" : 1408,
-            "mlp_ratio"   : 4.3637,
-            "input_size": (1024//8, 1024//8),
-            "args": hydit_args,
-        },
-        "sampling_settings" : {
-            "beta_schedule" : "linear",
-            "linear_start"  : 0.00085,
-            "linear_end"    : 0.03,
-            "timesteps"     : 1000,
-        },
-    },
-    "G/2-1.2": {
-        "unet_config": {
-            "depth"       : 40,
-            "num_heads"   : 16,
-            "patch_size"  : 2,
-            "hidden_size" : 1408,
-            "mlp_ratio"   : 4.3637,
-            "input_size": (1024//8, 1024//8),
-            "cond_style": False,
-            "cond_res"  : False,
-            "args": hydit_args,
-        },
-        "sampling_settings" : {
-            "beta_schedule" : "linear",
-            "linear_start"  : 0.00085,
-            "linear_end"    : 0.018,
-            "timesteps"     : 1000,
-        },
-    }
-}
-
-# these are the same as regular DiT, I think
-from ..DiT.conf import dit_conf
-for name in ["XL/2", "L/2", "B/2"]:
-    hydit_conf[name] = {
-        "unet_config": dit_conf[name]["unet_config"].copy(),
-        "sampling_settings": hydit_conf["G/2"]["sampling_settings"],
-    }
-    hydit_conf[name]["unet_config"]["args"] = hydit_args
diff --git a/HunYuanDiT/config_clip.json b/HunYuanDiT/config_clip.json
deleted file mode 100644
index f629874..0000000
--- a/HunYuanDiT/config_clip.json
+++ /dev/null
@@ -1,34 +0,0 @@
-{
-  "_name_or_path": "hfl/chinese-roberta-wwm-ext-large",
-  "architectures": [
-    "BertModel"
-  ],
-  "attention_probs_dropout_prob": 0.1,
-  "bos_token_id": 0,
-  "classifier_dropout": null,
-  "directionality": "bidi",
-  "eos_token_id": 2,
-  "hidden_act": "gelu",
-  "hidden_dropout_prob": 0.1,
-  "hidden_size": 1024,
-  "initializer_range": 0.02,
-  "intermediate_size": 4096,
-  "layer_norm_eps": 1e-12,
-  "max_position_embeddings": 512,
-  "model_type": "bert",
-  "num_attention_heads": 16,
-  "num_hidden_layers": 24,
-  "output_past": true,
-  "pad_token_id": 0,
-  "pooler_fc_size": 768,
-  "pooler_num_attention_heads": 12,
-  "pooler_num_fc_layers": 3,
-  "pooler_size_per_head": 128,
-  "pooler_type": "first_token_transform",
-  "position_embedding_type": "absolute",
-  "torch_dtype": "float32",
-  "transformers_version": "4.22.1",
-  "type_vocab_size": 2,
-  "use_cache": true,
-  "vocab_size": 47020
-}
diff --git a/HunYuanDiT/config_mt5.json b/HunYuanDiT/config_mt5.json
deleted file mode 100644
index 825cb35..0000000
--- a/HunYuanDiT/config_mt5.json
+++ /dev/null
@@ -1,33 +0,0 @@
-{
-  "_name_or_path": "mt5",
-  "architectures": [
-    "MT5EncoderModel"
-  ],
-  "classifier_dropout": 0.0,
-  "d_ff": 5120,
-  "d_kv": 64,
-  "d_model": 2048,
-  "decoder_start_token_id": 0,
-  "dense_act_fn": "gelu_new",
-  "dropout_rate": 0.1,
-  "eos_token_id": 1,
-  "feed_forward_proj": "gated-gelu",
-  "initializer_factor": 1.0,
-  "is_encoder_decoder": true,
-  "is_gated_act": true,
-  "layer_norm_epsilon": 1e-06,
-  "model_type": "mt5",
-  "num_decoder_layers": 24,
-  "num_heads": 32,
-  "num_layers": 24,
-  "output_past": true,
-  "pad_token_id": 0,
-  "relative_attention_max_distance": 128,
-  "relative_attention_num_buckets": 32,
-  "tie_word_embeddings": false,
-  "tokenizer_class": "T5Tokenizer",
-  "torch_dtype": "float16",
-  "transformers_version": "4.40.2",
-  "use_cache": true,
-  "vocab_size": 250112
-}
diff --git a/HunYuanDiT/loader.py b/HunYuanDiT/loader.py
deleted file mode 100644
index 23be529..0000000
--- a/HunYuanDiT/loader.py
+++ /dev/null
@@ -1,80 +0,0 @@
-import comfy.supported_models_base
-import comfy.latent_formats
-import comfy.model_patcher
-import comfy.model_base
-import comfy.utils
-import comfy.conds
-import torch
-from comfy import model_management
-from tqdm import tqdm
-
-class EXM_HYDiT(comfy.supported_models_base.BASE):
-    unet_config = {}
-    unet_extra_config = {}
-    latent_format = comfy.latent_formats.SDXL
-
-    def __init__(self, model_conf):
-        self.unet_config = model_conf.get("unet_config", {})
-        self.sampling_settings = model_conf.get("sampling_settings", {})
-        self.latent_format = self.latent_format()
-        # UNET is handled by extension
-        self.unet_config["disable_unet_model_creation"] = True
-
-    def model_type(self, state_dict, prefix=""):
-        return comfy.model_base.ModelType.V_PREDICTION
-
-class EXM_HYDiT_Model(comfy.model_base.BaseModel):
-    def __init__(self, *args, **kwargs):
-        super().__init__(*args, **kwargs)
-
-    def extra_conds(self, **kwargs):
-        out = super().extra_conds(**kwargs)
-
-        for name in ["context_t5", "context_mask", "context_t5_mask"]:
-            out[name] = comfy.conds.CONDRegular(kwargs[name])
-
-        src_size_cond = kwargs.get("src_size_cond", None)
-        if src_size_cond is not None:
-            out["src_size_cond"] = comfy.conds.CONDRegular(torch.tensor(src_size_cond))
-
-        return out
-
-def load_hydit(model_path, model_conf):
-    state_dict = comfy.utils.load_torch_file(model_path)
-    state_dict = state_dict.get("model", state_dict)
-
-    parameters = comfy.utils.calculate_parameters(state_dict)
-    unet_dtype = model_management.unet_dtype(model_params=parameters)
-    load_device = comfy.model_management.get_torch_device()
-    offload_device = comfy.model_management.unet_offload_device()
-
-    # ignore fp8/etc and use directly for now
-    manual_cast_dtype = model_management.unet_manual_cast(unet_dtype, load_device)
-    if manual_cast_dtype:
-        print(f"HunYuanDiT: falling back to {manual_cast_dtype}")
-        unet_dtype = manual_cast_dtype
-
-    model_conf = EXM_HYDiT(model_conf)
-    model = EXM_HYDiT_Model(
-        model_conf,
-        model_type=comfy.model_base.ModelType.V_PREDICTION,
-        device=model_management.get_torch_device()
-    )
-
-    from .models.models import HunYuanDiT
-    model.diffusion_model = HunYuanDiT(
-        **model_conf.unet_config,
-        log_fn=tqdm.write,
-    )
-
-    model.diffusion_model.load_state_dict(state_dict)
-    model.diffusion_model.dtype = unet_dtype
-    model.diffusion_model.eval()
-    model.diffusion_model.to(unet_dtype)
-
-    model_patcher = comfy.model_patcher.ModelPatcher(
-        model,
-        load_device=load_device,
-        offload_device=offload_device,
-    )
-    return model_patcher
diff --git a/HunYuanDiT/models/attn_layers.py b/HunYuanDiT/models/attn_layers.py
deleted file mode 100644
index b767d83..0000000
--- a/HunYuanDiT/models/attn_layers.py
+++ /dev/null
@@ -1,374 +0,0 @@
-import torch
-import torch.nn as nn
-from typing import Tuple, Union, Optional
-
-try:
-    import flash_attn
-    if hasattr(flash_attn, '__version__') and int(flash_attn.__version__[0]) == 2:
-        from flash_attn.flash_attn_interface import flash_attn_kvpacked_func
-        from flash_attn.modules.mha import FlashSelfAttention, FlashCrossAttention
-    else:
-        from flash_attn.flash_attn_interface import flash_attn_unpadded_kvpacked_func
-        from flash_attn.modules.mha import FlashSelfAttention, FlashCrossAttention
-except Exception as e:
-    print(f'flash_attn import failed: {e}')
-
-
-def reshape_for_broadcast(freqs_cis: Union[torch.Tensor, Tuple[torch.Tensor]], x: torch.Tensor, head_first=False):
-    """
-    Reshape frequency tensor for broadcasting it with another tensor.
-
-    This function reshapes the frequency tensor to have the same shape as the target tensor 'x'
-    for the purpose of broadcasting the frequency tensor during element-wise operations.
-
-    Args:
-        freqs_cis (Union[torch.Tensor, Tuple[torch.Tensor]]): Frequency tensor to be reshaped.
-        x (torch.Tensor): Target tensor for broadcasting compatibility.
-        head_first (bool): head dimension first (except batch dim) or not.
-
-    Returns:
-        torch.Tensor: Reshaped frequency tensor.
-
-    Raises:
-        AssertionError: If the frequency tensor doesn't match the expected shape.
-        AssertionError: If the target tensor 'x' doesn't have the expected number of dimensions.
-    """
-    ndim = x.ndim
-    assert 0 <= 1 < ndim
-
-    if isinstance(freqs_cis, tuple):
-        # freqs_cis: (cos, sin) in real space
-        if head_first:
-            assert freqs_cis[0].shape == (x.shape[-2], x.shape[-1]), f'freqs_cis shape {freqs_cis[0].shape} does not match x shape {x.shape}'
-            shape = [d if i == ndim - 2 or i == ndim - 1 else 1 for i, d in enumerate(x.shape)]
-        else:
-            assert freqs_cis[0].shape == (x.shape[1], x.shape[-1]), f'freqs_cis shape {freqs_cis[0].shape} does not match x shape {x.shape}'
-            shape = [d if i == 1 or i == ndim - 1 else 1 for i, d in enumerate(x.shape)]
-        return freqs_cis[0].view(*shape), freqs_cis[1].view(*shape)
-    else:
-        # freqs_cis: values in complex space
-        if head_first:
-            assert freqs_cis.shape == (x.shape[-2], x.shape[-1]), f'freqs_cis shape {freqs_cis.shape} does not match x shape {x.shape}'
-            shape = [d if i == ndim - 2 or i == ndim - 1 else 1 for i, d in enumerate(x.shape)]
-        else:
-            assert freqs_cis.shape == (x.shape[1], x.shape[-1]), f'freqs_cis shape {freqs_cis.shape} does not match x shape {x.shape}'
-            shape = [d if i == 1 or i == ndim - 1 else 1 for i, d in enumerate(x.shape)]
-        return freqs_cis.view(*shape)
-
-
-def rotate_half(x):
-    x_real, x_imag = x.float().reshape(*x.shape[:-1], -1, 2).unbind(-1)  # [B, S, H, D//2]
-    return torch.stack([-x_imag, x_real], dim=-1).flatten(3)
-
-
-def apply_rotary_emb(
-    xq: torch.Tensor,
-    xk: Optional[torch.Tensor],
-    freqs_cis: Union[torch.Tensor, Tuple[torch.Tensor]],
-    head_first: bool = False,
-) -> Tuple[torch.Tensor, torch.Tensor]:
-    """
-    Apply rotary embeddings to input tensors using the given frequency tensor.
-
-    This function applies rotary embeddings to the given query 'xq' and key 'xk' tensors using the provided
-    frequency tensor 'freqs_cis'. The input tensors are reshaped as complex numbers, and the frequency tensor
-    is reshaped for broadcasting compatibility. The resulting tensors contain rotary embeddings and are
-    returned as real tensors.
-
-    Args:
-        xq (torch.Tensor): Query tensor to apply rotary embeddings. [B, S, H, D]
-        xk (torch.Tensor): Key tensor to apply rotary embeddings.   [B, S, H, D]
-        freqs_cis (Union[torch.Tensor, Tuple[torch.Tensor]]): Precomputed frequency tensor for complex exponentials.
-        head_first (bool): head dimension first (except batch dim) or not.
-
-    Returns:
-        Tuple[torch.Tensor, torch.Tensor]: Tuple of modified query tensor and key tensor with rotary embeddings.
-
-    """
-    xk_out = None
-    if isinstance(freqs_cis, tuple):
-        cos, sin = reshape_for_broadcast(freqs_cis, xq, head_first)  # [S, D]
-        cos, sin = cos.to(xq.device), sin.to(xq.device)
-        xq_out = (xq.float() * cos + rotate_half(xq.float()) * sin).type_as(xq)
-        if xk is not None:
-            xk_out = (xk.float() * cos + rotate_half(xk.float()) * sin).type_as(xk)
-    else:
-        xq_ = torch.view_as_complex(xq.float().reshape(*xq.shape[:-1], -1, 2))  # [B, S, H, D//2]
-        freqs_cis = reshape_for_broadcast(freqs_cis, xq_, head_first).to(xq.device)  # [S, D//2] --> [1, S, 1, D//2]
-        xq_out = torch.view_as_real(xq_ * freqs_cis).flatten(3).type_as(xq)
-        if xk is not None:
-            xk_ = torch.view_as_complex(xk.float().reshape(*xk.shape[:-1], -1, 2))  # [B, S, H, D//2]
-            xk_out = torch.view_as_real(xk_ * freqs_cis).flatten(3).type_as(xk)
-
-    return xq_out, xk_out
-
-
-class FlashSelfMHAModified(nn.Module):
-    """
-    Use QK Normalization.
-    """
-    def __init__(self,
-                 dim,
-                 num_heads,
-                 qkv_bias=True,
-                 qk_norm=False,
-                 attn_drop=0.0,
-                 proj_drop=0.0,
-                 device=None,
-                 dtype=None,
-                 norm_layer=nn.LayerNorm,
-                 ):
-        factory_kwargs = {'device': device, 'dtype': dtype}
-        super().__init__()
-        self.dim = dim
-        self.num_heads = num_heads
-        assert self.dim % num_heads == 0, "self.kdim must be divisible by num_heads"
-        self.head_dim = self.dim // num_heads
-        assert self.head_dim % 8 == 0 and self.head_dim <= 128, "Only support head_dim <= 128 and divisible by 8"
-
-        self.Wqkv = nn.Linear(dim, 3 * dim, bias=qkv_bias, **factory_kwargs)
-        # TODO: eps should be 1 / 65530 if using fp16
-        self.q_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.k_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.inner_attn = FlashSelfAttention(attention_dropout=attn_drop)
-        self.out_proj = nn.Linear(dim, dim, bias=qkv_bias, **factory_kwargs)
-        self.proj_drop = nn.Dropout(proj_drop)
-
-    def forward(self, x, freqs_cis_img=None):
-        """
-        Parameters
-        ----------
-        x: torch.Tensor
-            (batch, seqlen, hidden_dim) (where hidden_dim = num heads * head dim)
-        freqs_cis_img: torch.Tensor
-            (batch, hidden_dim // 2), RoPE for image
-        """
-        b, s, d = x.shape
-
-        qkv = self.Wqkv(x)
-        qkv = qkv.view(b, s, 3, self.num_heads, self.head_dim)  # [b, s, 3, h, d]
-        q, k, v = qkv.unbind(dim=2)  # [b, s, h, d]
-        q = self.q_norm(q).half()    # [b, s, h, d]
-        k = self.k_norm(k).half()
-
-        # Apply RoPE if needed
-        if freqs_cis_img is not None:
-            qq, kk = apply_rotary_emb(q, k, freqs_cis_img)
-            assert qq.shape == q.shape and kk.shape == k.shape, f'qq: {qq.shape}, q: {q.shape}, kk: {kk.shape}, k: {k.shape}'
-            q, k = qq, kk
-
-        qkv = torch.stack([q, k, v], dim=2)  # [b, s, 3, h, d]
-        context = self.inner_attn(qkv)
-        out = self.out_proj(context.view(b, s, d))
-        out = self.proj_drop(out)
-
-        out_tuple = (out,)
-
-        return out_tuple
-
-
-class FlashCrossMHAModified(nn.Module):
-    """
-    Use QK Normalization.
-    """
-    def __init__(self,
-                 qdim,
-                 kdim,
-                 num_heads,
-                 qkv_bias=True,
-                 qk_norm=False,
-                 attn_drop=0.0,
-                 proj_drop=0.0,
-                 device=None,
-                 dtype=None,
-                 norm_layer=nn.LayerNorm,
-                 ):
-        factory_kwargs = {'device': device, 'dtype': dtype}
-        super().__init__()
-        self.qdim = qdim
-        self.kdim = kdim
-        self.num_heads = num_heads
-        assert self.qdim % num_heads == 0, "self.qdim must be divisible by num_heads"
-        self.head_dim = self.qdim // num_heads
-        assert self.head_dim % 8 == 0 and self.head_dim <= 128, "Only support head_dim <= 128 and divisible by 8"
-
-        self.scale = self.head_dim ** -0.5
-
-        self.q_proj = nn.Linear(qdim, qdim, bias=qkv_bias, **factory_kwargs)
-        self.kv_proj = nn.Linear(kdim, 2 * qdim, bias=qkv_bias, **factory_kwargs)
-
-        # TODO: eps should be 1 / 65530 if using fp16
-        self.q_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.k_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-
-        self.inner_attn = FlashCrossAttention(attention_dropout=attn_drop)
-        self.out_proj = nn.Linear(qdim, qdim, bias=qkv_bias, **factory_kwargs)
-        self.proj_drop = nn.Dropout(proj_drop)
-
-    def forward(self, x, y, freqs_cis_img=None):
-        """
-        Parameters
-        ----------
-        x: torch.Tensor
-            (batch, seqlen1, hidden_dim) (where hidden_dim = num_heads * head_dim)
-        y: torch.Tensor
-            (batch, seqlen2, hidden_dim2)
-        freqs_cis_img: torch.Tensor
-            (batch, hidden_dim // num_heads), RoPE for image
-        """
-        b, s1, _ = x.shape  # [b, s1, D]
-        _, s2, _ = y.shape  # [b, s2, 1024]
-
-        q = self.q_proj(x).view(b, s1, self.num_heads, self.head_dim)       # [b, s1, h, d]
-        kv = self.kv_proj(y).view(b, s2, 2, self.num_heads, self.head_dim)  # [b, s2, 2, h, d]
-        k, v = kv.unbind(dim=2)    # [b, s2, h, d]
-        q = self.q_norm(q).half()  # [b, s1, h, d]
-        k = self.k_norm(k).half()  # [b, s2, h, d]
-
-        # Apply RoPE if needed
-        if freqs_cis_img is not None:
-            qq, _ = apply_rotary_emb(q, None, freqs_cis_img)
-            assert qq.shape == q.shape, f'qq: {qq.shape}, q: {q.shape}'
-            q = qq  # [b, s1, h, d]
-        kv = torch.stack([k, v], dim=2)   # [b, s1, 2, h, d]
-        context = self.inner_attn(q, kv)  # [b, s1, h, d]
-        context = context.view(b, s1, -1) # [b, s1, D]
-
-        out = self.out_proj(context)
-        out = self.proj_drop(out)
-
-        out_tuple = (out,)
-
-        return out_tuple
-
-
-class CrossAttention(nn.Module):
-    """
-    Use QK Normalization.
-    """
-    def __init__(self,
-                 qdim,
-                 kdim,
-                 num_heads,
-                 qkv_bias=True,
-                 qk_norm=False,
-                 attn_drop=0.0,
-                 proj_drop=0.0,
-                 device=None,
-                 dtype=None,
-                 norm_layer=nn.LayerNorm,
-                 ):
-        factory_kwargs = {'device': device, 'dtype': dtype}
-        super().__init__()
-        self.qdim = qdim
-        self.kdim = kdim
-        self.num_heads = num_heads
-        assert self.qdim % num_heads == 0, "self.qdim must be divisible by num_heads"
-        self.head_dim = self.qdim // num_heads
-        assert self.head_dim % 8 == 0 and self.head_dim <= 128, "Only support head_dim <= 128 and divisible by 8"
-        self.scale = self.head_dim ** -0.5
-
-        self.q_proj = nn.Linear(qdim, qdim, bias=qkv_bias, **factory_kwargs)
-        self.kv_proj = nn.Linear(kdim, 2 * qdim, bias=qkv_bias, **factory_kwargs)
-
-        # TODO: eps should be 1 / 65530 if using fp16
-        self.q_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.k_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.attn_drop = nn.Dropout(attn_drop)
-        self.out_proj = nn.Linear(qdim, qdim, bias=qkv_bias, **factory_kwargs)
-        self.proj_drop = nn.Dropout(proj_drop)
-
-    def forward(self, x, y, freqs_cis_img=None):
-        """
-        Parameters
-        ----------
-        x: torch.Tensor
-            (batch, seqlen1, hidden_dim) (where hidden_dim = num heads * head dim)
-        y: torch.Tensor
-            (batch, seqlen2, hidden_dim2)
-        freqs_cis_img: torch.Tensor
-            (batch, hidden_dim // 2), RoPE for image
-        """
-        b, s1, c = x.shape  # [b, s1, D]
-        _, s2, c = y.shape  # [b, s2, 1024]
-
-        q = self.q_proj(x).view(b, s1, self.num_heads, self.head_dim)       # [b, s1, h, d]
-        kv = self.kv_proj(y).view(b, s2, 2, self.num_heads, self.head_dim)  # [b, s2, 2, h, d]
-        k, v = kv.unbind(dim=2)  # [b, s, h, d]
-        q = self.q_norm(q)
-        k = self.k_norm(k)
-
-        # Apply RoPE if needed
-        if freqs_cis_img is not None:
-            qq, _ = apply_rotary_emb(q, None, freqs_cis_img)
-            assert qq.shape == q.shape, f'qq: {qq.shape}, q: {q.shape}'
-            q = qq
-
-        q = q * self.scale
-        q = q.transpose(-2, -3).contiguous()    # q ->  B, L1, H, C - B, H, L1, C
-        k = k.permute(0, 2, 3, 1).contiguous()  # k ->  B, L2, H, C - B, H, C, L2
-        attn = q @ k                 # attn -> B, H, L1, L2
-        attn = attn.softmax(dim=-1)  # attn -> B, H, L1, L2
-        attn = self.attn_drop(attn)
-        x = attn @ v.transpose(-2, -3)  # v -> B, L2, H, C - B, H, L2, C    x -> B, H, L1, C
-        context = x.transpose(1, 2)     # context -> B, H, L1, C - B, L1, H, C
-
-        context = context.contiguous().view(b, s1, -1)
-
-        out = self.out_proj(context)  # context.reshape - B, L1, -1
-        out = self.proj_drop(out)
-
-        out_tuple = (out,)
-
-        return out_tuple
-
-
-class Attention(nn.Module):
-    """
-    We rename some layer names to align with flash attention
-    """
-    def __init__(self, dim, num_heads, qkv_bias=True, qk_norm=False, attn_drop=0., proj_drop=0.,
-                 norm_layer=nn.LayerNorm,
-                 ):
-        super().__init__()
-        self.dim = dim
-        self.num_heads = num_heads
-        assert self.dim % num_heads == 0, 'dim should be divisible by num_heads'
-        self.head_dim = self.dim // num_heads
-        # This assertion is aligned with flash attention
-        assert self.head_dim % 8 == 0 and self.head_dim <= 128, "Only support head_dim <= 128 and divisible by 8"
-        self.scale = self.head_dim ** -0.5
-
-        # qkv --> Wqkv
-        self.Wqkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
-        # TODO: eps should be 1 / 65530 if using fp16
-        self.q_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
-        self.k_norm = norm_layer(self.head_dim, elementwise_affine=True, eps=1e-6) if qk_norm else nn.Identity()
- self.attn_drop = nn.Dropout(attn_drop) - self.out_proj = nn.Linear(dim, dim) - self.proj_drop = nn.Dropout(proj_drop) - - def forward(self, x, freqs_cis_img=None): - B, N, C = x.shape - qkv = self.Wqkv(x).reshape(B, N, 3, self.num_heads, self.head_dim).permute(2, 0, 3, 1, 4) # [3, b, h, s, d] - q, k, v = qkv.unbind(0) # [b, h, s, d] - q = self.q_norm(q) # [b, h, s, d] - k = self.k_norm(k) # [b, h, s, d] - - # Apply RoPE if needed - if freqs_cis_img is not None: - qq, kk = apply_rotary_emb(q, k, freqs_cis_img, head_first=True) - assert qq.shape == q.shape and kk.shape == k.shape, \ - f'qq: {qq.shape}, q: {q.shape}, kk: {kk.shape}, k: {k.shape}' - q, k = qq, kk - - # just use SDP here for now - x = torch.nn.functional.scaled_dot_product_attention( - q, k, v, - ).permute(0, 2, 1, 3).contiguous().reshape(B, N, C) - x = self.out_proj(x) - x = self.proj_drop(x) - - out_tuple = (x,) - - return out_tuple diff --git a/HunYuanDiT/models/embedders.py b/HunYuanDiT/models/embedders.py deleted file mode 100644 index 9fe08cb..0000000 --- a/HunYuanDiT/models/embedders.py +++ /dev/null @@ -1,111 +0,0 @@ -import math -import torch -import torch.nn as nn -from einops import repeat - -from timm.models.layers import to_2tuple - - -class PatchEmbed(nn.Module): - """ 2D Image to Patch Embedding - - Image to Patch Embedding using Conv2d - - A convolution based approach to patchifying a 2D image w/ embedding projection. - - Based on the impl in https://github.com/google-research/vision_transformer - - Hacked together by / Copyright 2020 Ross Wightman - - Remove the _assert function in forward function to be compatible with multi-resolution images. 
- """ - def __init__( - self, - img_size=224, - patch_size=16, - in_chans=3, - embed_dim=768, - norm_layer=None, - flatten=True, - bias=True, - ): - super().__init__() - if isinstance(img_size, int): - img_size = to_2tuple(img_size) - elif isinstance(img_size, (tuple, list)) and len(img_size) == 2: - img_size = tuple(img_size) - else: - raise ValueError(f"img_size must be int or tuple/list of length 2. Got {img_size}") - patch_size = to_2tuple(patch_size) - self.img_size = img_size - self.patch_size = patch_size - self.grid_size = (img_size[0] // patch_size[0], img_size[1] // patch_size[1]) - self.num_patches = self.grid_size[0] * self.grid_size[1] - self.flatten = flatten - - self.proj = nn.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size, bias=bias) - self.norm = norm_layer(embed_dim) if norm_layer else nn.Identity() - - def update_image_size(self, img_size): - self.img_size = img_size - self.grid_size = (img_size[0] // self.patch_size[0], img_size[1] // self.patch_size[1]) - self.num_patches = self.grid_size[0] * self.grid_size[1] - - def forward(self, x): - # B, C, H, W = x.shape - # _assert(H == self.img_size[0], f"Input image height ({H}) doesn't match model ({self.img_size[0]}).") - # _assert(W == self.img_size[1], f"Input image width ({W}) doesn't match model ({self.img_size[1]}).") - x = self.proj(x) - if self.flatten: - x = x.flatten(2).transpose(1, 2) # BCHW -> BNC - x = self.norm(x) - return x - - -def timestep_embedding(t, dim, max_period=10000, repeat_only=False): - """ - Create sinusoidal timestep embeddings. - :param t: a 1-D Tensor of N indices, one per batch element. - These may be fractional. - :param dim: the dimension of the output. - :param max_period: controls the minimum frequency of the embeddings. - :return: an (N, D) Tensor of positional embeddings. 
- """ - # https://github.com/openai/glide-text2im/blob/main/glide_text2im/nn.py - if not repeat_only: - half = dim // 2 - freqs = torch.exp( - -math.log(max_period) - * torch.arange(start=0, end=half, dtype=torch.float32) - / half - ).to(device=t.device) # size: [dim/2], 一个指数衰减的曲线 - args = t[:, None].float() * freqs[None] - embedding = torch.cat([torch.cos(args), torch.sin(args)], dim=-1) - if dim % 2: - embedding = torch.cat( - [embedding, torch.zeros_like(embedding[:, :1])], dim=-1 - ) - else: - embedding = repeat(t, "b -> b d", d=dim) - return embedding - - -class TimestepEmbedder(nn.Module): - """ - Embeds scalar timesteps into vector representations. - """ - def __init__(self, hidden_size, frequency_embedding_size=256, out_size=None): - super().__init__() - if out_size is None: - out_size = hidden_size - self.mlp = nn.Sequential( - nn.Linear(frequency_embedding_size, hidden_size, bias=True), - nn.SiLU(), - nn.Linear(hidden_size, out_size, bias=True), - ) - self.frequency_embedding_size = frequency_embedding_size - - def forward(self, t): - t_freq = timestep_embedding(t, self.frequency_embedding_size).type(self.mlp[0].weight.dtype) - t_emb = self.mlp(t_freq) - return t_emb diff --git a/HunYuanDiT/models/models.py b/HunYuanDiT/models/models.py deleted file mode 100644 index 7dae413..0000000 --- a/HunYuanDiT/models/models.py +++ /dev/null @@ -1,439 +0,0 @@ -import torch -import torch.nn as nn -import torch.nn.functional as F -from timm.models.vision_transformer import Mlp - -from .attn_layers import Attention, FlashCrossMHAModified, FlashSelfMHAModified, CrossAttention -from .embedders import TimestepEmbedder, PatchEmbed, timestep_embedding -from .norm_layers import RMSNorm -from .poolers import AttentionPool -from .posemb_layers import get_2d_rotary_pos_embed, get_fill_resize_and_crop - -def modulate(x, shift, scale): - return x * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1) - - -class FP32_Layernorm(nn.LayerNorm): - def forward(self, inputs: torch.Tensor) -> 
torch.Tensor: - origin_dtype = inputs.dtype - return F.layer_norm(inputs.float(), self.normalized_shape, self.weight.float(), self.bias.float(), - self.eps).to(origin_dtype) - - -class FP32_SiLU(nn.SiLU): - def forward(self, inputs: torch.Tensor) -> torch.Tensor: - return torch.nn.functional.silu(inputs.float(), inplace=False).to(inputs.dtype) - - -class HunYuanDiTBlock(nn.Module): - """ - A HunYuanDiT block with `add` conditioning. - """ - def __init__(self, - hidden_size, - c_emb_size, - num_heads, - mlp_ratio=4.0, - text_states_dim=1024, - use_flash_attn=False, - qk_norm=False, - norm_type="layer", - skip=False, - ): - super().__init__() - self.use_flash_attn = use_flash_attn - use_ele_affine = True - - if norm_type == "layer": - norm_layer = FP32_Layernorm - elif norm_type == "rms": - norm_layer = RMSNorm - else: - raise ValueError(f"Unknown norm_type: {norm_type}") - - # ========================= Self-Attention ========================= - self.norm1 = norm_layer(hidden_size, elementwise_affine=use_ele_affine, eps=1e-6) - if use_flash_attn: - self.attn1 = FlashSelfMHAModified(hidden_size, num_heads=num_heads, qkv_bias=True, qk_norm=qk_norm) - else: - self.attn1 = Attention(hidden_size, num_heads=num_heads, qkv_bias=True, qk_norm=qk_norm) - - # ========================= FFN ========================= - self.norm2 = norm_layer(hidden_size, elementwise_affine=use_ele_affine, eps=1e-6) - mlp_hidden_dim = int(hidden_size * mlp_ratio) - approx_gelu = lambda: nn.GELU(approximate="tanh") - self.mlp = Mlp(in_features=hidden_size, hidden_features=mlp_hidden_dim, act_layer=approx_gelu, drop=0) - - # ========================= Add ========================= - # Simply use add like SDXL. 
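The `modulate` helper at the top of the deleted models.py is the adaLN-style conditioning that `FinalLayer` applies: `x * (1 + scale) + shift`, with the per-sample shift/scale vectors broadcast across all tokens. A hedged pure-Python equivalent for a single sample (illustration only, operating on nested lists instead of tensors):

```python
def modulate(x, shift, scale):
    """x: [L][D] token features; shift, scale: [D] conditioning vectors.

    For one sample this matches x * (1 + scale.unsqueeze(1)) + shift.unsqueeze(1).
    """
    return [[xi * (1.0 + sc) + sh for xi, sc, sh in zip(tok, scale, shift)]
            for tok in x]
```

A zero scale and zero shift leave the features unchanged, which is why the modulation linears are typically initialized near zero.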
- self.default_modulation = nn.Sequential( - FP32_SiLU(), - nn.Linear(c_emb_size, hidden_size, bias=True) - ) - - # ========================= Cross-Attention ========================= - if use_flash_attn: - self.attn2 = FlashCrossMHAModified(hidden_size, text_states_dim, num_heads=num_heads, qkv_bias=True, - qk_norm=qk_norm) - else: - self.attn2 = CrossAttention(hidden_size, text_states_dim, num_heads=num_heads, qkv_bias=True, - qk_norm=qk_norm) - self.norm3 = norm_layer(hidden_size, elementwise_affine=True, eps=1e-6) - - # ========================= Skip Connection ========================= - if skip: - self.skip_norm = norm_layer(2 * hidden_size, elementwise_affine=True, eps=1e-6) - self.skip_linear = nn.Linear(2 * hidden_size, hidden_size) - else: - self.skip_linear = None - - def forward(self, x, c=None, text_states=None, freq_cis_img=None, skip=None): - # Long Skip Connection - if self.skip_linear is not None: - cat = torch.cat([x, skip], dim=-1) - cat = self.skip_norm(cat) - x = self.skip_linear(cat) - - # Self-Attention - shift_msa = self.default_modulation(c).unsqueeze(dim=1) - attn_inputs = ( - self.norm1(x) + shift_msa, freq_cis_img, - ) - x = x + self.attn1(*attn_inputs)[0] - - # Cross-Attention - cross_inputs = ( - self.norm3(x), text_states, freq_cis_img - ) - x = x + self.attn2(*cross_inputs)[0] - - # FFN Layer - mlp_inputs = self.norm2(x) - x = x + self.mlp(mlp_inputs) - - return x - - -class FinalLayer(nn.Module): - """ - The final layer of HunYuanDiT. 
- """ - def __init__(self, final_hidden_size, c_emb_size, patch_size, out_channels): - super().__init__() - self.norm_final = nn.LayerNorm(final_hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(final_hidden_size, patch_size * patch_size * out_channels, bias=True) - self.adaLN_modulation = nn.Sequential( - FP32_SiLU(), - nn.Linear(c_emb_size, 2 * final_hidden_size, bias=True) - ) - - def forward(self, x, c): - shift, scale = self.adaLN_modulation(c).chunk(2, dim=1) - x = modulate(self.norm_final(x), shift, scale) - x = self.linear(x) - return x - - -class HunYuanDiT(nn.Module): - """ - HunYuanDiT: Diffusion model with a Transformer backbone. - - Parameters - ---------- - args: argparse.Namespace - The arguments parsed by argparse. - input_size: tuple - The size of the input image. - patch_size: int - The size of the patch. - in_channels: int - The number of input channels. - hidden_size: int - The hidden size of the transformer backbone. - depth: int - The number of transformer blocks. - num_heads: int - The number of attention heads. - mlp_ratio: float - The ratio of the hidden size of the MLP in the transformer block. - log_fn: callable - The logging function. 
- """ - def __init__( - self, args, - input_size=(32, 32), - patch_size=2, - in_channels=4, - hidden_size=1152, - depth=28, - num_heads=16, - mlp_ratio=4.0, - log_fn=print, - cond_style=True, - cond_res=True, - **kwargs, - ): - super().__init__() - self.args = args - self.log_fn = log_fn - self.depth = depth - self.learn_sigma = args.learn_sigma - self.in_channels = in_channels - self.out_channels = in_channels * 2 if args.learn_sigma else in_channels - self.patch_size = patch_size - self.num_heads = num_heads - self.hidden_size = hidden_size - self.head_size = hidden_size // num_heads - self.text_states_dim = args.text_states_dim - self.text_states_dim_t5 = args.text_states_dim_t5 - self.text_len = args.text_len - self.text_len_t5 = args.text_len_t5 - self.norm = args.norm - self.cond_res = cond_res - self.cond_style = cond_style - - use_flash_attn = args.infer_mode == 'fa' - if use_flash_attn: - log_fn(f" Enable Flash Attention.") - qk_norm = True # See http://arxiv.org/abs/2302.05442 for details. - - self.mlp_t5 = nn.Sequential( - nn.Linear(self.text_states_dim_t5, self.text_states_dim_t5 * 4, bias=True), - FP32_SiLU(), - nn.Linear(self.text_states_dim_t5 * 4, self.text_states_dim, bias=True), - ) - # learnable replace - self.text_embedding_padding = nn.Parameter( - torch.randn(self.text_len + self.text_len_t5, self.text_states_dim, dtype=torch.float32)) - - # Attention pooling - self.pooler = AttentionPool(self.text_len_t5, self.text_states_dim_t5, num_heads=8, output_dim=1024) - - - self.extra_in_dim = 0 - if self.cond_res: - # Image size and crop size conditions - self.extra_in_dim += 256 * 6 - if self.cond_style: - # Here we use a default learned embedder layer for future extension. 
- self.style_embedder = nn.Embedding(1, hidden_size) - self.extra_in_dim += hidden_size - - # Text embedding for `add` - self.last_size = input_size - self.x_embedder = PatchEmbed(input_size, patch_size, in_channels, hidden_size) - self.t_embedder = TimestepEmbedder(hidden_size) - self.extra_in_dim += 1024 - self.extra_embedder = nn.Sequential( - nn.Linear(self.extra_in_dim, hidden_size * 4), - FP32_SiLU(), - nn.Linear(hidden_size * 4, hidden_size, bias=True), - ) - - # Image embedding - num_patches = self.x_embedder.num_patches - log_fn(f" Number of tokens: {num_patches}") - - # HUnYuanDiT Blocks - self.blocks = nn.ModuleList([ - HunYuanDiTBlock(hidden_size=hidden_size, - c_emb_size=hidden_size, - num_heads=num_heads, - mlp_ratio=mlp_ratio, - text_states_dim=self.text_states_dim, - use_flash_attn=use_flash_attn, - qk_norm=qk_norm, - norm_type=self.norm, - skip=layer > depth // 2, - ) - for layer in range(depth) - ]) - - self.final_layer = FinalLayer(hidden_size, hidden_size, patch_size, self.out_channels) - self.unpatchify_channels = self.out_channels - - def forward_raw(self, - x, - t, - encoder_hidden_states=None, - text_embedding_mask=None, - encoder_hidden_states_t5=None, - text_embedding_mask_t5=None, - image_meta_size=None, - style=None, - cos_cis_img=None, - sin_cis_img=None, - return_dict=False, - ): - """ - Forward pass of the encoder. - - Parameters - ---------- - x: torch.Tensor - (B, D, H, W) - t: torch.Tensor - (B) - encoder_hidden_states: torch.Tensor - CLIP text embedding, (B, L_clip, D) - text_embedding_mask: torch.Tensor - CLIP text embedding mask, (B, L_clip) - encoder_hidden_states_t5: torch.Tensor - T5 text embedding, (B, L_t5, D) - text_embedding_mask_t5: torch.Tensor - T5 text embedding mask, (B, L_t5) - image_meta_size: torch.Tensor - (B, 6) - style: torch.Tensor - (B) - cos_cis_img: torch.Tensor - sin_cis_img: torch.Tensor - return_dict: bool - Whether to return a dictionary. 
- """ - - text_states = encoder_hidden_states # 2,77,1024 - text_states_t5 = encoder_hidden_states_t5 # 2,256,2048 - text_states_mask = text_embedding_mask.bool() # 2,77 - text_states_t5_mask = text_embedding_mask_t5.bool() # 2,256 - b_t5, l_t5, c_t5 = text_states_t5.shape - text_states_t5 = self.mlp_t5(text_states_t5.view(-1, c_t5)) - text_states = torch.cat([text_states, text_states_t5.view(b_t5, l_t5, -1)], dim=1) # 2,205,1024 - clip_t5_mask = torch.cat([text_states_mask, text_states_t5_mask], dim=-1) - - clip_t5_mask = clip_t5_mask - text_states = torch.where(clip_t5_mask.unsqueeze(2), text_states, self.text_embedding_padding.to(text_states)) - - _, _, oh, ow = x.shape - th, tw = oh // self.patch_size, ow // self.patch_size - - # ========================= Build time and image embedding ========================= - t = self.t_embedder(t) - x = self.x_embedder(x) - - # Get image RoPE embedding according to `reso`lution. - freqs_cis_img = (cos_cis_img, sin_cis_img) - - # ========================= Concatenate all extra vectors ========================= - # Build text tokens with pooling - extra_vec = self.pooler(encoder_hidden_states_t5) - - if self.cond_res: - # Build image meta size tokens - image_meta_size = timestep_embedding(image_meta_size.view(-1), 256) # [B * 6, 256] - # if self.args.use_fp16: - # image_meta_size = image_meta_size.half() - - image_meta_size = image_meta_size.view(-1, 6 * 256) - extra_vec = torch.cat([extra_vec, image_meta_size], dim=1) # [B, D + 6 * 256] - - if self.cond_style: - # Build style tokens - style_embedding = self.style_embedder(style) - extra_vec = torch.cat([extra_vec, style_embedding], dim=1) - - # Concatenate all extra vectors - c = t + self.extra_embedder(extra_vec.to(self.dtype)) # [B, D] - - # ========================= Forward pass through HunYuanDiT blocks ========================= - skips = [] - for layer, block in enumerate(self.blocks): - if layer > self.depth // 2: - skip = skips.pop() - x = block(x, c, text_states, 
freqs_cis_img, skip) # (N, L, D) - else: - x = block(x, c, text_states, freqs_cis_img) # (N, L, D) - - if layer < (self.depth // 2 - 1): - skips.append(x) - - # ========================= Final layer ========================= - x = self.final_layer(x, c) # (N, L, patch_size ** 2 * out_channels) - x = self.unpatchify(x, th, tw) # (N, out_channels, H, W) - - if return_dict: - return {'x': x} - return x - - def calc_rope(self, height, width): - """ - Probably not the best in terms of perf to have this here - """ - th = height // 8 // self.patch_size - tw = width // 8 // self.patch_size - base_size = 512 // 8 // self.patch_size - start, stop = get_fill_resize_and_crop((th, tw), base_size) - sub_args = [start, stop, (th, tw)] - rope = get_2d_rotary_pos_embed(self.head_size, *sub_args) - return rope - - def forward(self, x, timesteps, context, context_mask=None, context_t5=None, context_t5_mask=None, src_size_cond=(1024,1024), **kwargs): - """ - Forward pass that adapts comfy input to original forward function - x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) - timesteps: (N,) tensor of diffusion timesteps - context: (N, 1, 77, C) CLIP conditioning - context_t5: (N, 1, 256, C) MT5 conditioning - """ - # context_mask = torch.zeros(x.shape[0], 77, device=x.device) - # context_t5_mask = torch.zeros(x.shape[0], 256, device=x.device) - - # style - style = torch.as_tensor([0] * (x.shape[0]), device=x.device) - - # image size - todo separate for cond/uncond when batched - if torch.is_tensor(src_size_cond): - src_size_cond = (int(src_size_cond[0][0]), int(src_size_cond[0][1])) - - image_size = (x.shape[2]//2*16, x.shape[3]//2*16) - size_cond = list(src_size_cond) + [image_size[1], image_size[0], 0, 0] - image_meta_size = torch.as_tensor([size_cond] * x.shape[0], device=x.device) - - # RoPE - rope = self.calc_rope(*image_size) - - # Update x_embedder if image size changed - if self.last_size != image_size: - from tqdm import tqdm - 
tqdm.write(f"HyDiT: New image size {image_size}") - self.x_embedder.update_image_size( - (image_size[0]//8, image_size[1]//8), - ) - self.last_size = image_size - - # Run original forward pass - out = self.forward_raw( - x = x.to(self.dtype), - t = timesteps.to(self.dtype), - encoder_hidden_states = context.to(self.dtype), - text_embedding_mask = context_mask.to(self.dtype), - encoder_hidden_states_t5 = context_t5.to(self.dtype), - text_embedding_mask_t5 = context_t5_mask.to(self.dtype), - image_meta_size = image_meta_size.to(self.dtype), - style = style, - cos_cis_img = rope[0], - sin_cis_img = rope[1], - ) - - # return - out = out.to(torch.float) - if self.learn_sigma: - eps, rest = out[:, :self.in_channels], out[:, self.in_channels:] - return eps - else: - return out - - def unpatchify(self, x, h, w): - """ - x: (N, T, patch_size**2 * C) - imgs: (N, H, W, C) - """ - c = self.unpatchify_channels - p = self.x_embedder.patch_size[0] - # h = w = int(x.shape[1] ** 0.5) - assert h * w == x.shape[1] - - x = x.reshape(shape=(x.shape[0], h, w, p, p, c)) - x = torch.einsum('nhwpqc->nchpwq', x) - imgs = x.reshape(shape=(x.shape[0], c, h * p, w * p)) - return imgs diff --git a/HunYuanDiT/models/norm_layers.py b/HunYuanDiT/models/norm_layers.py deleted file mode 100644 index 5204ad9..0000000 --- a/HunYuanDiT/models/norm_layers.py +++ /dev/null @@ -1,68 +0,0 @@ -import torch -import torch.nn as nn - - -class RMSNorm(nn.Module): - def __init__(self, dim: int, elementwise_affine=True, eps: float = 1e-6): - """ - Initialize the RMSNorm normalization layer. - - Args: - dim (int): The dimension of the input tensor. - eps (float, optional): A small value added to the denominator for numerical stability. Default is 1e-6. - - Attributes: - eps (float): A small value added to the denominator for numerical stability. - weight (nn.Parameter): Learnable scaling parameter. 
- - """ - super().__init__() - self.eps = eps - if elementwise_affine: - self.weight = nn.Parameter(torch.ones(dim)) - - def _norm(self, x): - """ - Apply the RMSNorm normalization to the input tensor. - - Args: - x (torch.Tensor): The input tensor. - - Returns: - torch.Tensor: The normalized tensor. - - """ - return x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps) - - def forward(self, x): - """ - Forward pass through the RMSNorm layer. - - Args: - x (torch.Tensor): The input tensor. - - Returns: - torch.Tensor: The output tensor after applying RMSNorm. - - """ - output = self._norm(x.float()).type_as(x) - if hasattr(self, "weight"): - output = output * self.weight - return output - - -class GroupNorm32(nn.GroupNorm): - def __init__(self, num_groups, num_channels, eps=1e-5, dtype=None): - super().__init__(num_groups=num_groups, num_channels=num_channels, eps=eps, dtype=dtype) - - def forward(self, x): - y = super().forward(x).to(x.dtype) - return y - -def normalization(channels, dtype=None): - """ - Make a standard normalization layer. - :param channels: number of input channels. - :return: an nn.Module for normalization. 
- """ - return GroupNorm32(num_channels=channels, num_groups=32, dtype=dtype) diff --git a/HunYuanDiT/models/poolers.py b/HunYuanDiT/models/poolers.py deleted file mode 100644 index a4adcac..0000000 --- a/HunYuanDiT/models/poolers.py +++ /dev/null @@ -1,39 +0,0 @@ -import torch -import torch.nn as nn -import torch.nn.functional as F - - -class AttentionPool(nn.Module): - def __init__(self, spacial_dim: int, embed_dim: int, num_heads: int, output_dim: int = None): - super().__init__() - self.positional_embedding = nn.Parameter(torch.randn(spacial_dim + 1, embed_dim) / embed_dim ** 0.5) - self.k_proj = nn.Linear(embed_dim, embed_dim) - self.q_proj = nn.Linear(embed_dim, embed_dim) - self.v_proj = nn.Linear(embed_dim, embed_dim) - self.c_proj = nn.Linear(embed_dim, output_dim or embed_dim) - self.num_heads = num_heads - - def forward(self, x): - x = x.permute(1, 0, 2) # NLC -> LNC - x = torch.cat([x.mean(dim=0, keepdim=True), x], dim=0) # (L+1)NC - x = x + self.positional_embedding[:, None, :].to(x.dtype) # (L+1)NC - x, _ = F.multi_head_attention_forward( - query=x[:1], key=x, value=x, - embed_dim_to_check=x.shape[-1], - num_heads=self.num_heads, - q_proj_weight=self.q_proj.weight, - k_proj_weight=self.k_proj.weight, - v_proj_weight=self.v_proj.weight, - in_proj_weight=None, - in_proj_bias=torch.cat([self.q_proj.bias, self.k_proj.bias, self.v_proj.bias]), - bias_k=None, - bias_v=None, - add_zero_attn=False, - dropout_p=0, - out_proj_weight=self.c_proj.weight, - out_proj_bias=self.c_proj.bias, - use_separate_proj_weight=True, - training=self.training, - need_weights=False - ) - return x.squeeze(0) diff --git a/HunYuanDiT/models/posemb_layers.py b/HunYuanDiT/models/posemb_layers.py deleted file mode 100644 index 62c83df..0000000 --- a/HunYuanDiT/models/posemb_layers.py +++ /dev/null @@ -1,225 +0,0 @@ -import torch -import numpy as np -from typing import Union - - -def _to_tuple(x): - if isinstance(x, int): - return x, x - else: - return x - - -def 
get_fill_resize_and_crop(src, tgt): # src: source resolution, tgt: base resolution - th, tw = _to_tuple(tgt) - h, w = _to_tuple(src) - - tr = th / tw # base aspect ratio - r = h / w # source aspect ratio - - # resize - if r > tr: - resize_height = th - resize_width = int(round(th / h * w)) - else: - resize_width = tw - resize_height = int(round(tw / w * h)) # resize the source resolution down to fit the base resolution - - crop_top = int(round((th - resize_height) / 2.0)) - crop_left = int(round((tw - resize_width) / 2.0)) - - return (crop_top, crop_left), (crop_top + resize_height, crop_left + resize_width) - - -def get_meshgrid(start, *args): - if len(args) == 0: - # start is grid_size - num = _to_tuple(start) - start = (0, 0) - stop = num - elif len(args) == 1: - # start is start, args[0] is stop, step is 1 - start = _to_tuple(start) - stop = _to_tuple(args[0]) - num = (stop[0] - start[0], stop[1] - start[1]) - elif len(args) == 2: - # start is start, args[0] is stop, args[1] is num - start = _to_tuple(start) # top-left corner, e.g. (12, 0) - stop = _to_tuple(args[0]) # bottom-right corner, e.g. (20, 32) - num = _to_tuple(args[1]) # target size, e.g. (32, 124) - else: - raise ValueError(f"len(args) should be 0, 1 or 2, but got {len(args)}") - - grid_h = np.linspace(start[0], stop[0], num[0], endpoint=False, dtype=np.float32) # e.g. 32 interpolated points over 12-20, or 124 points over 0-32 - grid_w = np.linspace(start[1], stop[1], num[1], endpoint=False, dtype=np.float32) - grid = np.meshgrid(grid_w, grid_h) # here w goes first - grid = np.stack(grid, axis=0) # [2, W, H] - return grid - -################################################################################# -# Sine/Cosine Positional Embedding Functions # -################################################################################# -# https://github.com/facebookresearch/mae/blob/main/util/pos_embed.py - -def get_2d_sincos_pos_embed(embed_dim, start, *args, cls_token=False, extra_tokens=0): - """ - grid_size: int of the grid height and width - return: - pos_embed: [grid_size*grid_size, embed_dim] or [1+grid_size*grid_size, embed_dim] (w/ or w/o cls_token) - """
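`get_fill_resize_and_crop` above does an aspect-preserving fit of the source grid into a square base-sized box and returns the corners of the centered region. A pure-Python replica (hypothetical name, square base only) to make the geometry concrete:

```python
def fill_resize_and_crop(src, base):
    """Fit a src (h, w) grid into a base x base box, preserving aspect ratio.

    Returns the top-left and bottom-right corners of the centered region,
    mirroring the deleted get_fill_resize_and_crop for a square target.
    """
    th = tw = base
    h, w = src
    if h / w > th / tw:              # source is taller than the target box
        rh, rw = th, int(round(th / h * w))
    else:                            # source is wider (or the same shape)
        rw, rh = tw, int(round(tw / w * h))
    top = int(round((th - rh) / 2.0))
    left = int(round((tw - rw) / 2.0))
    return (top, left), (top + rh, left + rw)
```

For example, a 32x64 latent grid fit into a base of 32 keeps the full width and centers 16 rows vertically, giving corners ((8, 0), (24, 32)).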
- grid = get_meshgrid(start, *args) # [2, H, w] - # grid_h = np.arange(grid_size, dtype=np.float32) - # grid_w = np.arange(grid_size, dtype=np.float32) - # grid = np.meshgrid(grid_w, grid_h) # here w goes first - # grid = np.stack(grid, axis=0) # [2, W, H] - - grid = grid.reshape([2, 1, *grid.shape[1:]]) - pos_embed = get_2d_sincos_pos_embed_from_grid(embed_dim, grid) - if cls_token and extra_tokens > 0: - pos_embed = np.concatenate([np.zeros([extra_tokens, embed_dim]), pos_embed], axis=0) - return pos_embed - - -def get_2d_sincos_pos_embed_from_grid(embed_dim, grid): - assert embed_dim % 2 == 0 - - # use half of dimensions to encode grid_h - emb_h = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[0]) # (H*W, D/2) - emb_w = get_1d_sincos_pos_embed_from_grid(embed_dim // 2, grid[1]) # (H*W, D/2) - - emb = np.concatenate([emb_h, emb_w], axis=1) # (H*W, D) - return emb - - -def get_1d_sincos_pos_embed_from_grid(embed_dim, pos): - """ - embed_dim: output dimension for each position - pos: a list of positions to be encoded: size (W,H) - out: (M, D) - """ - assert embed_dim % 2 == 0 - omega = np.arange(embed_dim // 2, dtype=np.float64) - omega /= embed_dim / 2. - omega = 1. / 10000**omega # (D/2,) - - pos = pos.reshape(-1) # (M,) - out = np.einsum('m,d->md', pos, omega) # (M, D/2), outer product - - emb_sin = np.sin(out) # (M, D/2) - emb_cos = np.cos(out) # (M, D/2) - - emb = np.concatenate([emb_sin, emb_cos], axis=1) # (M, D) - return emb - - -################################################################################# -# Rotary Positional Embedding Functions # -################################################################################# -# https://github.com/facebookresearch/llama/blob/main/llama/model.py#L443 - -def get_2d_rotary_pos_embed(embed_dim, start, *args, use_real=True): - """ - This is a 2d version of precompute_freqs_cis, which is a RoPE for image tokens with 2d structure. 
- - Parameters - ---------- - embed_dim: int - embedding dimension size - start: int or tuple of int - If len(args) == 0, start is num; If len(args) == 1, start is start, args[0] is stop, step is 1; - If len(args) == 2, start is start, args[0] is stop, args[1] is num. - use_real: bool - If True, return real part and imaginary part separately. Otherwise, return complex numbers. - - Returns - ------- - pos_embed: torch.Tensor - [HW, D/2] - """ - grid = get_meshgrid(start, *args) # [2, H, w] - grid = grid.reshape([2, 1, *grid.shape[1:]]) # returns a sampling grid whose resolution matches the target resolution - pos_embed = get_2d_rotary_pos_embed_from_grid(embed_dim, grid, use_real=use_real) - return pos_embed - - -def get_2d_rotary_pos_embed_from_grid(embed_dim, grid, use_real=False): - assert embed_dim % 4 == 0 - - # use half of dimensions to encode grid_h - emb_h = get_1d_rotary_pos_embed(embed_dim // 2, grid[0].reshape(-1), use_real=use_real) # (H*W, D/4) - emb_w = get_1d_rotary_pos_embed(embed_dim // 2, grid[1].reshape(-1), use_real=use_real) # (H*W, D/4) - - if use_real: - cos = torch.cat([emb_h[0], emb_w[0]], dim=1) # (H*W, D/2) - sin = torch.cat([emb_h[1], emb_w[1]], dim=1) # (H*W, D/2) - return cos, sin - else: - emb = torch.cat([emb_h, emb_w], dim=1) # (H*W, D/2) - return emb - - -def get_1d_rotary_pos_embed(dim: int, pos: Union[np.ndarray, int], theta: float = 10000.0, use_real=False): - """ - Precompute the frequency tensor for complex exponentials (cis) with given dimensions. - - This function calculates a frequency tensor with complex exponentials using the given dimension 'dim' - and the end index 'end'. The 'theta' parameter scales the frequencies. - The returned tensor contains complex values in complex64 data type. - - Args: - dim (int): Dimension of the frequency tensor. - pos (np.ndarray, int): Position indices for the frequency tensor. [S] or scalar - theta (float, optional): Scaling factor for frequency computation. Defaults to 10000.0.
- use_real (bool, optional): If True, return real part and imaginary part separately. - Otherwise, return complex numbers. - - Returns: - torch.Tensor: Precomputed frequency tensor with complex exponentials. [S, D/2] - - """ - if isinstance(pos, int): - pos = np.arange(pos) - freqs = 1.0 / (theta ** (torch.arange(0, dim, 2)[: (dim // 2)].float() / dim)) # [D/2] - t = torch.from_numpy(pos).to(freqs.device) # type: ignore # [S] - freqs = torch.outer(t, freqs).float() # type: ignore # [S, D/2] - if use_real: - freqs_cos = freqs.cos().repeat_interleave(2, dim=1) # [S, D] - freqs_sin = freqs.sin().repeat_interleave(2, dim=1) # [S, D] - return freqs_cos, freqs_sin - else: - freqs_cis = torch.polar(torch.ones_like(freqs), freqs) # complex64 # [S, D/2] - return freqs_cis - - - -def calc_sizes(rope_img, patch_size, th, tw): - """ Compute the RoPE sizes. """ - if rope_img == 'extend': - # extend mode - sub_args = [(th, tw)] - elif rope_img.startswith('base'): - # based on one size; other sizes are obtained by interpolation. - base_size = int(rope_img[4:]) // 8 // patch_size # e.g. base512: use 512 as the base size and interpolate the others from it - start, stop = get_fill_resize_and_crop((th, tw), base_size) # top-left and bottom-right corners of the crop within the 32x32 grid - sub_args = [start, stop, (th, tw)] - else: - raise ValueError(f"Unknown rope_img: {rope_img}") - return sub_args - - -def init_image_posemb(rope_img, - resolutions, - patch_size, - hidden_size, - num_heads, - log_fn, - rope_real=True, - ): - freqs_cis_img = {} - for reso in resolutions: - th, tw = reso.height // 8 // patch_size, reso.width // 8 // patch_size - sub_args = calc_sizes(rope_img, patch_size, th, tw) # [top-left, bottom-right, target (h, w)] crop corners within the 32x32 grid - freqs_cis_img[str(reso)] = get_2d_rotary_pos_embed(hidden_size // num_heads, *sub_args, use_real=rope_real) - log_fn(f" Using image RoPE ({rope_img}) ({'real' if rope_real else 'complex'}): {sub_args} | ({reso}) " - f"{freqs_cis_img[str(reso)][0].shape if rope_real else freqs_cis_img[str(reso)].shape}") - return freqs_cis_img diff --git a/HunYuanDiT/mt5_tokenizer/config.json
b/HunYuanDiT/mt5_tokenizer/config.json deleted file mode 100644 index d5c7028..0000000 --- a/HunYuanDiT/mt5_tokenizer/config.json +++ /dev/null @@ -1,33 +0,0 @@ -{ - "_name_or_path": "mt5", - "architectures": [ - "MT5ForConditionalGeneration" - ], - "classifier_dropout": 0.0, - "d_ff": 5120, - "d_kv": 64, - "d_model": 2048, - "decoder_start_token_id": 0, - "dense_act_fn": "gelu_new", - "dropout_rate": 0.1, - "eos_token_id": 1, - "feed_forward_proj": "gated-gelu", - "initializer_factor": 1.0, - "is_encoder_decoder": true, - "is_gated_act": true, - "layer_norm_epsilon": 1e-06, - "model_type": "mt5", - "num_decoder_layers": 24, - "num_heads": 32, - "num_layers": 24, - "output_past": true, - "pad_token_id": 0, - "relative_attention_max_distance": 128, - "relative_attention_num_buckets": 32, - "tie_word_embeddings": false, - "tokenizer_class": "T5Tokenizer", - "torch_dtype": "float16", - "transformers_version": "4.40.2", - "use_cache": true, - "vocab_size": 250112 -} diff --git a/HunYuanDiT/mt5_tokenizer/special_tokens_map.json b/HunYuanDiT/mt5_tokenizer/special_tokens_map.json deleted file mode 100644 index 6dc4d43..0000000 --- a/HunYuanDiT/mt5_tokenizer/special_tokens_map.json +++ /dev/null @@ -1 +0,0 @@ -{"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>"} \ No newline at end of file diff --git a/HunYuanDiT/mt5_tokenizer/tokenizer_config.json b/HunYuanDiT/mt5_tokenizer/tokenizer_config.json deleted file mode 100644 index 712e82c..0000000 --- a/HunYuanDiT/mt5_tokenizer/tokenizer_config.json +++ /dev/null @@ -1 +0,0 @@ -{"eos_token": "</s>", "unk_token": "<unk>", "pad_token": "<pad>", "extra_ids": 0, "additional_special_tokens": null, "special_tokens_map_file": "/home/patrick/.cache/torch/transformers/685ac0ca8568ec593a48b61b0a3c272beee9bc194a3c7241d15dcadb5f875e53.f76030f3ec1b96a8199b2593390c610e76ca8028ef3d24680000619ffb646276", "tokenizer_file": null, "name_or_path": "google/mt5-small"} \ No newline at end of file diff --git a/HunYuanDiT/nodes.py b/HunYuanDiT/nodes.py deleted file
mode 100644 index dbce038..0000000 --- a/HunYuanDiT/nodes.py +++ /dev/null @@ -1,198 +0,0 @@ -import os -import folder_paths -from copy import deepcopy - -from .conf import hydit_conf -from .loader import load_hydit - -class HYDiTCheckpointLoader: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "ckpt_name": (folder_paths.get_filename_list("checkpoints"),), - "model": (list(hydit_conf.keys()),{"default":"G/2"}), - } - } - RETURN_TYPES = ("MODEL",) - RETURN_NAMES = ("model",) - FUNCTION = "load_checkpoint" - CATEGORY = "ExtraModels/HunyuanDiT" - TITLE = "Hunyuan DiT Checkpoint Loader" - - def load_checkpoint(self, ckpt_name, model): - ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name) - model_conf = hydit_conf[model] - model = load_hydit( - model_path = ckpt_path, - model_conf = model_conf, - ) - return (model,) - -#### temp stuff for the text encoder #### -import torch -from .tenc import load_clip, load_t5 -from ..utils.dtype import string_to_dtype -dtypes = [ - "default", - "auto (comfy)", - "FP32", - "FP16", - "BF16" -] - -class HYDiTTextEncoderLoader: - @classmethod - def INPUT_TYPES(s): - devices = ["auto", "cpu", "gpu"] - # hack for using second GPU as offload - for k in range(1, torch.cuda.device_count()): - devices.append(f"cuda:{k}") - return { - "required": { - "clip_name": (folder_paths.get_filename_list("clip"),), - "mt5_name": (folder_paths.get_filename_list("t5"),), - "device": (devices, {"default":"cpu"}), - "dtype": (dtypes,), - } - } - - RETURN_TYPES = ("CLIP", "T5") - FUNCTION = "load_model" - CATEGORY = "ExtraModels/HunyuanDiT" - TITLE = "Hunyuan DiT Text Encoder Loader" - - def load_model(self, clip_name, mt5_name, device, dtype): - dtype = string_to_dtype(dtype, "text_encoder") - if device == "cpu": - assert dtype in [None, torch.float32, torch.bfloat16], f"Can't use dtype '{dtype}' with CPU! Set dtype to 'default' or 'bf16'." 
- - clip = load_clip( - model_path = folder_paths.get_full_path("clip", clip_name), - device = device, - dtype = dtype, - ) - t5 = load_t5( - model_path = folder_paths.get_full_path("t5", mt5_name), - device = device, - dtype = dtype, - ) - return(clip, t5) - -class HYDiTTextEncode: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "text": ("STRING", {"multiline": True}), - "text_t5": ("STRING", {"multiline": True}), - "CLIP": ("CLIP",), - "T5": ("T5",), - } - } - - RETURN_TYPES = ("CONDITIONING",) - FUNCTION = "encode" - CATEGORY = "ExtraModels/HunyuanDiT" - TITLE = "Hunyuan DiT Text Encode" - - def encode(self, text, text_t5, CLIP, T5): - # T5 - T5.load_model() - t5_pre = T5.tokenizer( - text_t5, - max_length = T5.cond_stage_model.max_length, - padding = 'max_length', - truncation = True, - return_attention_mask = True, - add_special_tokens = True, - return_tensors = 'pt' - ) - t5_mask = t5_pre["attention_mask"] - with torch.no_grad(): - t5_outs = T5.cond_stage_model.transformer( - input_ids = t5_pre["input_ids"].to(T5.load_device), - attention_mask = t5_mask.to(T5.load_device), - output_hidden_states = True, - ) - # to-do: replace -1 for clip skip - t5_embs = t5_outs["hidden_states"][-1].float().cpu() - - # "clip" - CLIP.load_model() - clip_pre = CLIP.tokenizer( - text, - max_length = CLIP.cond_stage_model.max_length, - padding = 'max_length', - truncation = True, - return_attention_mask = True, - add_special_tokens = True, - return_tensors = 'pt' - ) - clip_mask = clip_pre["attention_mask"] - with torch.no_grad(): - clip_outs = CLIP.cond_stage_model.transformer( - input_ids = clip_pre["input_ids"].to(CLIP.load_device), - attention_mask = clip_mask.to(CLIP.load_device), - ) - # to-do: add hidden states - clip_embs = clip_outs[0].float().cpu() - - # combined cond - return ([[ - clip_embs, { - "context_t5": t5_embs, - "context_mask": clip_mask.float(), - "context_t5_mask": t5_mask.float() - } - ]],) - -class HYDiTTextEncodeSimple(HYDiTTextEncode): 
- @classmethod - def INPUT_TYPES(s): - return { - "required": { - "text": ("STRING", {"multiline": True}), - "CLIP": ("CLIP",), - "T5": ("T5",), - } - } - - FUNCTION = "encode_simple" - TITLE = "Hunyuan DiT Text Encode (simple)" - - def encode_simple(self, text, **args): - return self.encode(text=text, text_t5=text, **args) - -class HYDiTSrcSizeCond: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "cond": ("CONDITIONING", ), - "width": ("INT", {"default": 1024, "min": 0, "max": 8192, "step": 16}), - "height": ("INT", {"default": 1024, "min": 0, "max": 8192, "step": 16}), - } - } - - RETURN_TYPES = ("CONDITIONING",) - RETURN_NAMES = ("cond",) - FUNCTION = "add_cond" - CATEGORY = "ExtraModels/HunyuanDiT" - TITLE = "Hunyuan DiT Size Conditioning (advanced)" - - def add_cond(self, cond, width, height): - cond = deepcopy(cond) - for c in range(len(cond)): - cond[c][1].update({ - "src_size_cond": [[height, width]], - }) - return (cond,) - -NODE_CLASS_MAPPINGS = { - "HYDiTCheckpointLoader": HYDiTCheckpointLoader, - "HYDiTTextEncoderLoader": HYDiTTextEncoderLoader, - "HYDiTTextEncode": HYDiTTextEncode, - "HYDiTTextEncodeSimple": HYDiTTextEncodeSimple, - "HYDiTSrcSizeCond": HYDiTSrcSizeCond, -} diff --git a/HunYuanDiT/tenc.py b/HunYuanDiT/tenc.py deleted file mode 100644 index a2d7107..0000000 --- a/HunYuanDiT/tenc.py +++ /dev/null @@ -1,180 +0,0 @@ -# This is for loading the CLIP (bert?)
+ mT5 encoder for HunYuanDiT -import os -import torch -from transformers import AutoTokenizer, modeling_utils -from transformers import T5Config, T5EncoderModel, BertConfig, BertModel - -from comfy import model_management -import comfy.model_patcher -import comfy.utils - -class mT5Model(torch.nn.Module): - def __init__(self, textmodel_json_config=None, device="cpu", max_length=256, freeze=True, dtype=None): - super().__init__() - self.device = device - self.dtype = dtype - self.max_length = max_length - if textmodel_json_config is None: - textmodel_json_config = os.path.join( - os.path.dirname(os.path.realpath(__file__)), - f"config_mt5.json" - ) - config = T5Config.from_json_file(textmodel_json_config) - with modeling_utils.no_init_weights(): - self.transformer = T5EncoderModel(config) - self.to(dtype) - if freeze: - self.freeze() - - def freeze(self): - self.transformer = self.transformer.eval() - for param in self.parameters(): - param.requires_grad = False - - def load_sd(self, sd): - return self.transformer.load_state_dict(sd, strict=False) - - def to(self, *args, **kwargs): - return self.transformer.to(*args, **kwargs) - -class hyCLIPModel(torch.nn.Module): - def __init__(self, textmodel_json_config=None, device="cpu", max_length=77, freeze=True, dtype=None): - super().__init__() - self.device = device - self.dtype = dtype - self.max_length = max_length - if textmodel_json_config is None: - textmodel_json_config = os.path.join( - os.path.dirname(os.path.realpath(__file__)), - f"config_clip.json" - ) - config = BertConfig.from_json_file(textmodel_json_config) - with modeling_utils.no_init_weights(): - self.transformer = BertModel(config) - self.to(dtype) - if freeze: - self.freeze() - - def freeze(self): - self.transformer = self.transformer.eval() - for param in self.parameters(): - param.requires_grad = False - - def load_sd(self, sd): - return self.transformer.load_state_dict(sd, strict=False) - - def to(self, *args, **kwargs): - return 
self.transformer.to(*args, **kwargs) - -class EXM_HyDiT_Tenc_Temp: - def __init__(self, no_init=False, device="cpu", dtype=None, model_class="mT5", **kwargs): - if no_init: - return - - size = 8 if model_class == "mT5" else 2 - if dtype == torch.float32: - size *= 2 - size *= (1024**3) - - if device == "auto": - self.load_device = model_management.text_encoder_device() - self.offload_device = model_management.text_encoder_offload_device() - self.init_device = "cpu" - elif device == "cpu": - size = 0 # doesn't matter - self.load_device = "cpu" - self.offload_device = "cpu" - self.init_device="cpu" - elif device.startswith("cuda"): - print("Direct CUDA device override!\nVRAM will not be freed by default.") - size = 0 # not used - self.load_device = device - self.offload_device = device - self.init_device = device - else: - self.load_device = model_management.get_torch_device() - self.offload_device = "cpu" - self.init_device="cpu" - - self.dtype = dtype - self.device = self.load_device - if model_class == "mT5": - self.cond_stage_model = mT5Model( - device = self.load_device, - dtype = self.dtype, - ) - tokenizer_args = {"subfolder": "t2i/mt5"} # web - tokenizer_path = os.path.join( # local - os.path.dirname(os.path.realpath(__file__)), - "mt5_tokenizer", - ) - else: - self.cond_stage_model = hyCLIPModel( - device = self.load_device, - dtype = self.dtype, - ) - tokenizer_args = {"subfolder": "t2i/tokenizer",} # web - tokenizer_path = os.path.join( # local - os.path.dirname(os.path.realpath(__file__)), - "tokenizer", - ) - # self.tokenizer = AutoTokenizer.from_pretrained( - # "Tencent-Hunyuan/HunyuanDiT", - # **tokenizer_args - # ) - self.tokenizer = AutoTokenizer.from_pretrained(tokenizer_path) - self.patcher = comfy.model_patcher.ModelPatcher( - self.cond_stage_model, - load_device = self.load_device, - offload_device = self.offload_device, - size = size, - ) - - def clone(self): - n = EXM_HyDiT_Tenc_Temp(no_init=True) - n.patcher = self.patcher.clone() -
n.cond_stage_model = self.cond_stage_model - n.tokenizer = self.tokenizer - return n - - def load_sd(self, sd): - return self.cond_stage_model.load_sd(sd) - - def get_sd(self): - return self.cond_stage_model.state_dict() - - def load_model(self): - if self.load_device != "cpu": - model_management.load_model_gpu(self.patcher) - return self.patcher - - def add_patches(self, patches, strength_patch=1.0, strength_model=1.0): - return self.patcher.add_patches(patches, strength_patch, strength_model) - - def get_key_patches(self): - return self.patcher.get_key_patches() - -def load_clip(model_path, **kwargs): - model = EXM_HyDiT_Tenc_Temp(model_class="clip", **kwargs) - sd = comfy.utils.load_torch_file(model_path) - - prefix = "bert." - state_dict = {} - for key in sd: - nkey = key - if key.startswith(prefix): - nkey = key[len(prefix):] - state_dict[nkey] = sd[key] - - m, e = model.load_sd(state_dict) - if len(m) > 0 or len(e) > 0: - print(f"HYDiT: clip missing {len(m)} keys ({len(e)} extra)") - return model - -def load_t5(model_path, **kwargs): - model = EXM_HyDiT_Tenc_Temp(model_class="mT5", **kwargs) - sd = comfy.utils.load_torch_file(model_path) - m, e = model.load_sd(sd) - if len(m) > 0 or len(e) > 0: - print(f"HYDiT: mT5 missing {len(m)} keys ({len(e)} extra)") - return model diff --git a/HunYuanDiT/tokenizer/config.json b/HunYuanDiT/tokenizer/config.json deleted file mode 100644 index f629874..0000000 --- a/HunYuanDiT/tokenizer/config.json +++ /dev/null @@ -1,34 +0,0 @@ -{ - "_name_or_path": "hfl/chinese-roberta-wwm-ext-large", - "architectures": [ - "BertModel" - ], - "attention_probs_dropout_prob": 0.1, - "bos_token_id": 0, - "classifier_dropout": null, - "directionality": "bidi", - "eos_token_id": 2, - "hidden_act": "gelu", - "hidden_dropout_prob": 0.1, - "hidden_size": 1024, - "initializer_range": 0.02, - "intermediate_size": 4096, - "layer_norm_eps": 1e-12, - "max_position_embeddings": 512, - "model_type": "bert", - "num_attention_heads": 16, - 
"num_hidden_layers": 24, - "output_past": true, - "pad_token_id": 0, - "pooler_fc_size": 768, - "pooler_num_attention_heads": 12, - "pooler_num_fc_layers": 3, - "pooler_size_per_head": 128, - "pooler_type": "first_token_transform", - "position_embedding_type": "absolute", - "torch_dtype": "float32", - "transformers_version": "4.22.1", - "type_vocab_size": 2, - "use_cache": true, - "vocab_size": 47020 -} diff --git a/HunYuanDiT/tokenizer/special_tokens_map.json b/HunYuanDiT/tokenizer/special_tokens_map.json deleted file mode 100644 index a8b3208..0000000 --- a/HunYuanDiT/tokenizer/special_tokens_map.json +++ /dev/null @@ -1,7 +0,0 @@ -{ - "cls_token": "[CLS]", - "mask_token": "[MASK]", - "pad_token": "[PAD]", - "sep_token": "[SEP]", - "unk_token": "[UNK]" -} diff --git a/HunYuanDiT/tokenizer/tokenizer_config.json b/HunYuanDiT/tokenizer/tokenizer_config.json deleted file mode 100644 index a143560..0000000 --- a/HunYuanDiT/tokenizer/tokenizer_config.json +++ /dev/null @@ -1,16 +0,0 @@ -{ - "cls_token": "[CLS]", - "do_basic_tokenize": true, - "do_lower_case": true, - "mask_token": "[MASK]", - "name_or_path": "hfl/chinese-roberta-wwm-ext", - "never_split": null, - "pad_token": "[PAD]", - "sep_token": "[SEP]", - "special_tokens_map_file": "/home/chenweifeng/.cache/huggingface/hub/models--hfl--chinese-roberta-wwm-ext/snapshots/5c58d0b8ec1d9014354d691c538661bf00bfdb44/special_tokens_map.json", - "strip_accents": null, - "tokenize_chinese_chars": true, - "tokenizer_class": "BertTokenizer", - "unk_token": "[UNK]", - "model_max_length": 77 -} diff --git a/HunYuanDiT/tokenizer/vocab.txt b/HunYuanDiT/tokenizer/vocab.txt deleted file mode 100644 index 6246906..0000000 --- a/HunYuanDiT/tokenizer/vocab.txt +++ /dev/null @@ -1,47020 +0,0 @@ -[PAD] -[unused1] -[unused2] -[unused3] -[unused4] -[unused5] -[unused6] -[unused7] -[unused8] -[unused9] -[unused10] -[unused11] -[unused12] -[unused13] -[unused14] -[unused15] -[unused16] -[unused17] -[unused18] -[unused19] -[unused20] 
-[unused21] -[unused22] -[unused23] -[unused24] -[unused25] -[unused26] -[unused27] -[unused28] -[unused29] -[unused30] -[unused31] -[unused32] -[unused33] -[unused34] -[unused35] -[unused36] -[unused37] -[unused38] -[unused39] -[unused40] -[unused41] -[unused42] -[unused43] -[unused44] -[unused45] -[unused46] -[unused47] -[unused48] -[unused49] -[unused50] -[unused51] -[unused52] -[unused53] -[unused54] -[unused55] -[unused56] -[unused57] -[unused58] -[unused59] -[unused60] -[unused61] -[unused62] -[unused63] -[unused64] -[unused65] -[unused66] -[unused67] -[unused68] -[unused69] -[unused70] -[unused71] -[unused72] -[unused73] -[unused74] -[unused75] -[unused76] -[unused77] -[unused78] -[unused79] -[unused80] -[unused81] -[unused82] -[unused83] -[unused84] -[unused85] -[unused86] -[unused87] -[unused88] -[unused89] -[unused90] -[unused91] -[unused92] -[unused93] -[unused94] -[unused95] -[unused96] -[unused97] -[unused98] -[unused99] -[UNK] -[CLS] -[SEP] -[MASK] - - -! -" -# -$ -% -& -' -( -) -* -+ -, -- -. -/ -0 -1 -2 -3 -4 -5 -6 -7 -8 -9 -: -; -< -= -> -? -@ -[ -\ -] -^ -_ -a -b -c -d -e -f -g -h -i -j -k -l -m -n -o -p -q -r -s -t -u -v -w -x -y -z -{ -| -} -~ -£ -¤ -¥ -§ -© -« -® -° -± -² -³ -µ -· -¹ -º -» -¼ -× -ß -æ -÷ -ø -đ -ŋ -ɔ -ə -ɡ -ʰ -ˇ -ˈ -ˊ -ˋ -ˍ -ː -˙ -˚ -ˢ -α -β -γ -δ -ε -η -θ -ι -κ -λ -μ -ν -ο -π -ρ -ς -σ -τ -υ -φ -χ -ψ -ω -а -б -в -г -д -е -ж -з -и -к -л -м -н -о -п -р -с -т -у -ф -х -ц -ч -ш -ы -ь -я -і -ا -ب -ة -ت -د -ر -س -ع -ل -م -ن -ه -و -ي -۩ -ก -ง -น -ม -ย -ร -อ -า -เ -๑ -་ -ღ -ᄀ -ᄁ -ᄂ -ᄃ -ᄅ -ᄆ -ᄇ -ᄈ -ᄉ -ᄋ -ᄌ -ᄎ -ᄏ -ᄐ -ᄑ -ᄒ -ᅡ -ᅢ -ᅣ -ᅥ -ᅦ -ᅧ -ᅨ -ᅩ -ᅪ -ᅬ -ᅭ -ᅮ -ᅯ -ᅲ -ᅳ -ᅴ -ᅵ -ᆨ -ᆫ -ᆯ -ᆷ -ᆸ -ᆺ -ᆻ -ᆼ -ᗜ -ᵃ -ᵉ -ᵍ -ᵏ -ᵐ -ᵒ -ᵘ -‖ -„ -† -• -‥ -‧ -
 -‰ -′ -″ -‹ -› -※ -‿ -⁄ -ⁱ -⁺ -ⁿ -₁ -₂ -₃ -₄ -€ -℃ -№ -™ -ⅰ -ⅱ -ⅲ -ⅳ -ⅴ -← -↑ -→ -↓ -↔ -↗ -↘ -⇒ -∀ -− -∕ -∙ -√ -∞ -∟ -∠ -∣ -∥ -∩ -∮ -∶ -∼ -∽ -≈ -≒ -≡ -≤ -≥ -≦ -≧ -≪ -≫ -⊙ -⋅ -⋈ -⋯ -⌒ -① -② -③ -④ -⑤ -⑥ -⑦ -⑧ -⑨ -⑩ -⑴ -⑵ -⑶ -⑷ -⑸ -⒈ -⒉ -⒊ -⒋ -ⓒ -ⓔ -ⓘ -─ -━ -│ -┃ -┅ -┆ -┊ -┌ -└ -├ -┣ -═ -║ -╚ -╞ -╠ -╭ -╮ -╯ -╰ -╱ -╳ -▂ -▃ -▅ -▇ -█ -▉ -▋ -▌ -▍ -▎ -■ -□ -▪ -▫ -▬ -▲ -△ -▶ -► -▼ -▽ -◆ -◇ -○ -◎ -● -◕ -◠ -◢ -◤ -☀ -★ -☆ -☕ -☞ -☺ -☼ -♀ -♂ -♠ -♡ -♣ -♥ -♦ -♪ -♫ -♬ -✈ -✔ -✕ -✖ -✦ -✨ -✪ -✰ -✿ -❀ -❤ -➜ -➤ -⦿ -、 -。 -〃 -々 -〇 -〈 -〉 -《 -》 -「 -」 -『 -』 -【 -】 -〓 -〔 -〕 -〖 -〗 -〜 -〝 -〞 -ぁ -あ -ぃ -い -う -ぇ -え -お -か -き -く -け -こ -さ -し -す -せ -そ -た -ち -っ -つ -て -と -な -に -ぬ -ね -の -は -ひ -ふ -へ -ほ -ま -み -む -め -も -ゃ -や -ゅ -ゆ -ょ -よ -ら -り -る -れ -ろ -わ -を -ん -゜ -ゝ -ァ -ア -ィ -イ -ゥ -ウ -ェ -エ -ォ -オ -カ -キ -ク -ケ -コ -サ -シ -ス -セ -ソ -タ -チ -ッ -ツ -テ -ト -ナ -ニ -ヌ -ネ -ノ -ハ -ヒ -フ -ヘ -ホ -マ -ミ -ム -メ -モ -ャ -ヤ -ュ -ユ -ョ -ヨ -ラ -リ -ル -レ -ロ -ワ -ヲ -ン -ヶ -・ -ー -ヽ -ㄅ -ㄆ -ㄇ -ㄉ -ㄋ -ㄌ -ㄍ -ㄎ -ㄏ -ㄒ -ㄚ -ㄛ -ㄞ -ㄟ -ㄢ -ㄤ -ㄥ -ㄧ -ㄨ -ㆍ -㈦ -㊣ -㎡ -㗎 -一 -丁 -七 -万 -丈 -三 -上 -下 -不 -与 -丐 -丑 -专 -且 -丕 -世 -丘 -丙 -业 -丛 -东 -丝 -丞 -丟 -両 -丢 -两 -严 -並 -丧 -丨 -个 -丫 -中 -丰 -串 -临 -丶 -丸 -丹 -为 -主 -丼 -丽 -举 -丿 -乂 -乃 -久 -么 -义 -之 -乌 -乍 -乎 -乏 -乐 -乒 -乓 -乔 -乖 -乗 -乘 -乙 -乜 -九 -乞 -也 -习 -乡 -书 -乩 -买 -乱 -乳 -乾 -亀 -亂 -了 -予 -争 -事 -二 -于 -亏 -云 -互 -五 -井 -亘 -亙 -亚 -些 -亜 -亞 -亟 -亡 -亢 -交 -亥 -亦 -产 -亨 -亩 -享 -京 -亭 -亮 -亲 -亳 -亵 -人 -亿 -什 -仁 -仃 -仄 -仅 -仆 -仇 -今 -介 -仍 -从 -仏 -仑 -仓 -仔 -仕 -他 -仗 -付 -仙 -仝 -仞 -仟 -代 -令 -以 -仨 -仪 -们 -仮 -仰 -仲 -件 -价 -任 -份 -仿 -企 -伉 -伊 -伍 -伎 -伏 -伐 -休 -伕 -众 -优 -伙 -会 -伝 -伞 -伟 -传 -伢 -伤 -伦 -伪 -伫 -伯 -估 -伴 -伶 -伸 -伺 -似 -伽 -佃 -但 -佇 -佈 -位 -低 -住 -佐 -佑 -体 -佔 -何 -佗 -佘 -余 -佚 -佛 -作 -佝 -佞 -佟 -你 -佢 -佣 -佤 -佥 -佩 -佬 -佯 -佰 -佳 -併 -佶 -佻 -佼 -使 -侃 -侄 -來 -侈 -例 -侍 -侏 -侑 -侖 -侗 -供 -依 -侠 -価 -侣 -侥 -侦 -侧 -侨 -侬 -侮 -侯 -侵 -侶 -侷 -便 -係 -促 -俄 -俊 -俎 -俏 -俐 -俑 -俗 -俘 -俚 -保 -俞 -俟 -俠 -信 -俨 -俩 -俪 -俬 -俭 -修 -俯 -俱 -俳 -俸 -俺 -俾 -倆 -倉 -個 -倌 -倍 -倏 -們 -倒 -倔 -倖 -倘 -候 -倚 -倜 -借 -倡 -値 -倦 -倩 -倪 -倫 -倬 -倭 -倶 -债 -值 -倾 -偃 -假 -偈 -偉 -偌 -偎 -偏 -偕 -做 -停 -健 -側 -偵 -偶 -偷 -偻 -偽 -偿 -傀 -傅 -傍 -傑 -傘 -備 -傚 -傢 -傣 -傥 -储 -傩 -催 -傭 -傲 -傳 -債 -傷 -傻 -傾 -僅 -働 -像 -僑 
-僕 -僖 -僚 -僥 -僧 -僭 -僮 -僱 -僵 -價 -僻 -儀 -儂 -億 -儆 -儉 -儋 -儒 -儕 -儘 -償 -儡 -優 -儲 -儷 -儼 -儿 -兀 -允 -元 -兄 -充 -兆 -兇 -先 -光 -克 -兌 -免 -児 -兑 -兒 -兔 -兖 -党 -兜 -兢 -入 -內 -全 -兩 -八 -公 -六 -兮 -兰 -共 -兲 -关 -兴 -兵 -其 -具 -典 -兹 -养 -兼 -兽 -冀 -内 -円 -冇 -冈 -冉 -冊 -册 -再 -冏 -冒 -冕 -冗 -写 -军 -农 -冠 -冢 -冤 -冥 -冨 -冪 -冬 -冯 -冰 -冲 -决 -况 -冶 -冷 -冻 -冼 -冽 -冾 -净 -凄 -准 -凇 -凈 -凉 -凋 -凌 -凍 -减 -凑 -凛 -凜 -凝 -几 -凡 -凤 -処 -凪 -凭 -凯 -凰 -凱 -凳 -凶 -凸 -凹 -出 -击 -函 -凿 -刀 -刁 -刃 -分 -切 -刈 -刊 -刍 -刎 -刑 -划 -列 -刘 -则 -刚 -创 -初 -删 -判 -別 -刨 -利 -刪 -别 -刮 -到 -制 -刷 -券 -刹 -刺 -刻 -刽 -剁 -剂 -剃 -則 -剉 -削 -剋 -剌 -前 -剎 -剐 -剑 -剔 -剖 -剛 -剜 -剝 -剣 -剤 -剥 -剧 -剩 -剪 -副 -割 -創 -剷 -剽 -剿 -劃 -劇 -劈 -劉 -劊 -劍 -劏 -劑 -力 -劝 -办 -功 -加 -务 -劣 -动 -助 -努 -劫 -劭 -励 -劲 -劳 -労 -劵 -効 -劾 -势 -勁 -勃 -勇 -勉 -勋 -勐 -勒 -動 -勖 -勘 -務 -勛 -勝 -勞 -募 -勢 -勤 -勧 -勳 -勵 -勸 -勺 -勻 -勾 -勿 -匀 -包 -匆 -匈 -匍 -匐 -匕 -化 -北 -匙 -匝 -匠 -匡 -匣 -匪 -匮 -匯 -匱 -匹 -区 -医 -匾 -匿 -區 -十 -千 -卅 -升 -午 -卉 -半 -卍 -华 -协 -卑 -卒 -卓 -協 -单 -卖 -南 -単 -博 -卜 -卞 -卟 -占 -卡 -卢 -卤 -卦 -卧 -卫 -卮 -卯 -印 -危 -即 -却 -卵 -卷 -卸 -卻 -卿 -厂 -厄 -厅 -历 -厉 -压 -厌 -厕 -厘 -厚 -厝 -原 -厢 -厥 -厦 -厨 -厩 -厭 -厮 -厲 -厳 -去 -县 -叁 -参 -參 -又 -叉 -及 -友 -双 -反 -収 -发 -叔 -取 -受 -变 -叙 -叛 -叟 -叠 -叡 -叢 -口 -古 -句 -另 -叨 -叩 -只 -叫 -召 -叭 -叮 -可 -台 -叱 -史 -右 -叵 -叶 -号 -司 -叹 -叻 -叼 -叽 -吁 -吃 -各 -吆 -合 -吉 -吊 -吋 -同 -名 -后 -吏 -吐 -向 -吒 -吓 -吕 -吖 -吗 -君 -吝 -吞 -吟 -吠 -吡 -否 -吧 -吨 -吩 -含 -听 -吭 -吮 -启 -吱 -吳 -吴 -吵 -吶 -吸 -吹 -吻 -吼 -吽 -吾 -呀 -呂 -呃 -呆 -呈 -告 -呋 -呎 -呐 -呓 -呕 -呗 -员 -呛 -呜 -呢 -呤 -呦 -周 -呱 -呲 -味 -呵 -呷 -呸 -呻 -呼 -命 -咀 -咁 -咂 -咄 -咆 -咋 -和 -咎 -咏 -咐 -咒 -咔 -咕 -咖 -咗 -咘 -咙 -咚 -咛 -咣 -咤 -咦 -咧 -咨 -咩 -咪 -咫 -咬 -咭 -咯 -咱 -咲 -咳 -咸 -咻 -咽 -咿 -哀 -品 -哂 -哄 -哆 -哇 -哈 -哉 -哋 -哌 -响 -哎 -哏 -哐 -哑 -哒 -哔 -哗 -哟 -員 -哥 -哦 -哧 -哨 -哩 -哪 -哭 -哮 -哲 -哺 -哼 -哽 -唁 -唄 -唆 -唇 -唉 -唏 -唐 -唑 -唔 -唠 -唤 -唧 -唬 -售 -唯 -唰 -唱 -唳 -唷 -唸 -唾 -啃 -啄 -商 -啉 -啊 -問 -啓 -啕 -啖 -啜 -啞 -啟 -啡 -啤 -啥 -啦 -啧 -啪 -啫 -啬 -啮 -啰 -啱 -啲 -啵 -啶 -啷 -啸 -啻 -啼 -啾 -喀 -喂 -喃 -善 -喆 -喇 -喉 -喊 -喋 -喎 -喏 -喔 -喘 -喙 -喚 -喜 -喝 -喟 -喧 -喪 -喫 -喬 -單 -喰 -喱 -喲 -喳 -喵 -営 -喷 -喹 -喺 -喻 -喽 -嗅 -嗆 -嗇 -嗎 -嗑 -嗒 -嗓 -嗔 -嗖 -嗚 -嗜 -嗝 -嗟 -嗡 -嗣 -嗤 -嗦 -嗨 -嗪 -嗬 -嗯 -嗰 -嗲 -嗳 -嗶 -嗷 -嗽 -嘀 -嘅 -嘆 -嘈 -嘉 -嘌 -嘍 -嘎 -嘔 -嘖 -嘗 -嘘 -嘚 -嘛 -嘜 -嘞 -嘟 -嘢 -嘣 -嘤 -嘧 -嘩 -嘭 -嘮 -嘯 -嘰 -嘱 -嘲 -嘴 -嘶 -嘸 
-嘹 -嘻 -嘿 -噁 -噌 -噎 -噓 -噔 -噗 -噙 -噜 -噠 -噢 -噤 -器 -噩 -噪 -噬 -噱 -噴 -噶 -噸 -噹 -噻 -噼 -嚀 -嚇 -嚎 -嚏 -嚐 -嚓 -嚕 -嚟 -嚣 -嚥 -嚨 -嚮 -嚴 -嚷 -嚼 -囂 -囉 -囊 -囍 -囑 -囔 -囗 -囚 -四 -囝 -回 -囟 -因 -囡 -团 -団 -囤 -囧 -囪 -囫 -园 -困 -囱 -囲 -図 -围 -囹 -固 -国 -图 -囿 -圃 -圄 -圆 -圈 -國 -圍 -圏 -園 -圓 -圖 -團 -圜 -土 -圣 -圧 -在 -圩 -圭 -地 -圳 -场 -圻 -圾 -址 -坂 -均 -坊 -坍 -坎 -坏 -坐 -坑 -块 -坚 -坛 -坝 -坞 -坟 -坠 -坡 -坤 -坦 -坨 -坪 -坯 -坳 -坵 -坷 -垂 -垃 -垄 -型 -垒 -垚 -垛 -垠 -垢 -垣 -垦 -垩 -垫 -垭 -垮 -垵 -埂 -埃 -埋 -城 -埔 -埕 -埗 -域 -埠 -埤 -埵 -執 -埸 -培 -基 -埼 -堀 -堂 -堃 -堅 -堆 -堇 -堑 -堕 -堙 -堡 -堤 -堪 -堯 -堰 -報 -場 -堵 -堺 -堿 -塊 -塌 -塑 -塔 -塗 -塘 -塚 -塞 -塢 -塩 -填 -塬 -塭 -塵 -塾 -墀 -境 -墅 -墉 -墊 -墒 -墓 -増 -墘 -墙 -墜 -增 -墟 -墨 -墩 -墮 -墳 -墻 -墾 -壁 -壅 -壆 -壇 -壊 -壑 -壓 -壕 -壘 -壞 -壟 -壢 -壤 -壩 -士 -壬 -壮 -壯 -声 -売 -壳 -壶 -壹 -壺 -壽 -处 -备 -変 -复 -夏 -夔 -夕 -外 -夙 -多 -夜 -够 -夠 -夢 -夥 -大 -天 -太 -夫 -夭 -央 -夯 -失 -头 -夷 -夸 -夹 -夺 -夾 -奂 -奄 -奇 -奈 -奉 -奋 -奎 -奏 -奐 -契 -奔 -奕 -奖 -套 -奘 -奚 -奠 -奢 -奥 -奧 -奪 -奬 -奮 -女 -奴 -奶 -奸 -她 -好 -如 -妃 -妄 -妆 -妇 -妈 -妊 -妍 -妒 -妓 -妖 -妘 -妙 -妝 -妞 -妣 -妤 -妥 -妨 -妩 -妪 -妮 -妲 -妳 -妹 -妻 -妾 -姆 -姉 -姊 -始 -姍 -姐 -姑 -姒 -姓 -委 -姗 -姚 -姜 -姝 -姣 -姥 -姦 -姨 -姪 -姫 -姬 -姹 -姻 -姿 -威 -娃 -娄 -娅 -娆 -娇 -娉 -娑 -娓 -娘 -娛 -娜 -娟 -娠 -娣 -娥 -娩 -娱 -娲 -娴 -娶 -娼 -婀 -婁 -婆 -婉 -婊 -婕 -婚 -婢 -婦 -婧 -婪 -婭 -婴 -婵 -婶 -婷 -婺 -婿 -媒 -媚 -媛 -媞 -媧 -媲 -媳 -媽 -媾 -嫁 -嫂 -嫉 -嫌 -嫑 -嫔 -嫖 -嫘 -嫚 -嫡 -嫣 -嫦 -嫩 -嫲 -嫵 -嫻 -嬅 -嬉 -嬌 -嬗 -嬛 -嬢 -嬤 -嬪 -嬰 -嬴 -嬷 -嬸 -嬿 -孀 -孃 -子 -孑 -孔 -孕 -孖 -字 -存 -孙 -孚 -孛 -孜 -孝 -孟 -孢 -季 -孤 -学 -孩 -孪 -孫 -孬 -孰 -孱 -孳 -孵 -學 -孺 -孽 -孿 -宁 -它 -宅 -宇 -守 -安 -宋 -完 -宏 -宓 -宕 -宗 -官 -宙 -定 -宛 -宜 -宝 -实 -実 -宠 -审 -客 -宣 -室 -宥 -宦 -宪 -宫 -宮 -宰 -害 -宴 -宵 -家 -宸 -容 -宽 -宾 -宿 -寂 -寄 -寅 -密 -寇 -富 -寐 -寒 -寓 -寛 -寝 -寞 -察 -寡 -寢 -寥 -實 -寧 -寨 -審 -寫 -寬 -寮 -寰 -寵 -寶 -寸 -对 -寺 -寻 -导 -対 -寿 -封 -専 -射 -将 -將 -專 -尉 -尊 -尋 -對 -導 -小 -少 -尔 -尕 -尖 -尘 -尚 -尝 -尤 -尧 -尬 -就 -尴 -尷 -尸 -尹 -尺 -尻 -尼 -尽 -尾 -尿 -局 -屁 -层 -屄 -居 -屆 -屈 -屉 -届 -屋 -屌 -屍 -屎 -屏 -屐 -屑 -展 -屜 -属 -屠 -屡 -屢 -層 -履 -屬 -屯 -山 -屹 -屿 -岀 -岁 -岂 -岌 -岐 -岑 -岔 -岖 -岗 -岘 -岙 -岚 -岛 -岡 -岩 -岫 -岬 -岭 -岱 -岳 -岷 -岸 -峇 -峋 -峒 -峙 -峡 -峤 -峥 -峦 -峨 -峪 -峭 -峯 -峰 -峴 -島 -峻 -峽 -崁 -崂 -崆 -崇 -崎 -崑 -崔 -崖 -崗 -崙 -崛 -崧 -崩 -崭 -崴 -崽 -嵇 -嵊 -嵋 -嵌 -嵐 -嵘 -嵩 -嵬 -嵯 -嶂 -嶄 -嶇 -嶋 -嶙 -嶺 -嶼 -嶽 -巅 -巍 -巒 -巔 -巖 -川 -州 -巡 -巢 -工 -左 -巧 
-巨 -巩 -巫 -差 -己 -已 -巳 -巴 -巷 -巻 -巽 -巾 -巿 -币 -市 -布 -帅 -帆 -师 -希 -帐 -帑 -帕 -帖 -帘 -帚 -帛 -帜 -帝 -帥 -带 -帧 -師 -席 -帮 -帯 -帰 -帳 -帶 -帷 -常 -帼 -帽 -幀 -幂 -幄 -幅 -幌 -幔 -幕 -幟 -幡 -幢 -幣 -幫 -干 -平 -年 -并 -幸 -幹 -幺 -幻 -幼 -幽 -幾 -广 -庁 -広 -庄 -庆 -庇 -床 -序 -庐 -库 -应 -底 -庖 -店 -庙 -庚 -府 -庞 -废 -庠 -度 -座 -庫 -庭 -庵 -庶 -康 -庸 -庹 -庾 -廁 -廂 -廃 -廈 -廉 -廊 -廓 -廖 -廚 -廝 -廟 -廠 -廢 -廣 -廬 -廳 -延 -廷 -建 -廿 -开 -弁 -异 -弃 -弄 -弈 -弊 -弋 -式 -弑 -弒 -弓 -弔 -引 -弗 -弘 -弛 -弟 -张 -弥 -弦 -弧 -弩 -弭 -弯 -弱 -張 -強 -弹 -强 -弼 -弾 -彅 -彆 -彈 -彌 -彎 -归 -当 -录 -彗 -彙 -彝 -形 -彤 -彥 -彦 -彧 -彩 -彪 -彫 -彬 -彭 -彰 -影 -彷 -役 -彻 -彼 -彿 -往 -征 -径 -待 -徇 -很 -徉 -徊 -律 -後 -徐 -徑 -徒 -従 -徕 -得 -徘 -徙 -徜 -從 -徠 -御 -徨 -復 -循 -徬 -微 -徳 -徴 -徵 -德 -徹 -徼 -徽 -心 -必 -忆 -忌 -忍 -忏 -忐 -忑 -忒 -忖 -志 -忘 -忙 -応 -忠 -忡 -忤 -忧 -忪 -快 -忱 -念 -忻 -忽 -忿 -怀 -态 -怂 -怅 -怆 -怎 -怏 -怒 -怔 -怕 -怖 -怙 -怜 -思 -怠 -怡 -急 -怦 -性 -怨 -怪 -怯 -怵 -总 -怼 -恁 -恃 -恆 -恋 -恍 -恐 -恒 -恕 -恙 -恚 -恢 -恣 -恤 -恥 -恨 -恩 -恪 -恫 -恬 -恭 -息 -恰 -恳 -恵 -恶 -恸 -恺 -恻 -恼 -恿 -悄 -悅 -悉 -悌 -悍 -悔 -悖 -悚 -悟 -悠 -患 -悦 -您 -悩 -悪 -悬 -悯 -悱 -悲 -悴 -悵 -悶 -悸 -悻 -悼 -悽 -情 -惆 -惇 -惊 -惋 -惑 -惕 -惘 -惚 -惜 -惟 -惠 -惡 -惦 -惧 -惨 -惩 -惫 -惬 -惭 -惮 -惯 -惰 -惱 -想 -惴 -惶 -惹 -惺 -愁 -愆 -愈 -愉 -愍 -意 -愕 -愚 -愛 -愜 -感 -愣 -愤 -愧 -愫 -愷 -愿 -慄 -慈 -態 -慌 -慎 -慑 -慕 -慘 -慚 -慟 -慢 -慣 -慧 -慨 -慫 -慮 -慰 -慳 -慵 -慶 -慷 -慾 -憂 -憊 -憋 -憎 -憐 -憑 -憔 -憚 -憤 -憧 -憨 -憩 -憫 -憬 -憲 -憶 -憾 -懂 -懇 -懈 -應 -懊 -懋 -懑 -懒 -懦 -懲 -懵 -懶 -懷 -懸 -懺 -懼 -懾 -懿 -戀 -戈 -戊 -戌 -戍 -戎 -戏 -成 -我 -戒 -戕 -或 -战 -戚 -戛 -戟 -戡 -戦 -截 -戬 -戮 -戰 -戲 -戳 -戴 -戶 -户 -戸 -戻 -戾 -房 -所 -扁 -扇 -扈 -扉 -手 -才 -扎 -扑 -扒 -打 -扔 -払 -托 -扛 -扣 -扦 -执 -扩 -扪 -扫 -扬 -扭 -扮 -扯 -扰 -扱 -扳 -扶 -批 -扼 -找 -承 -技 -抄 -抉 -把 -抑 -抒 -抓 -投 -抖 -抗 -折 -抚 -抛 -抜 -択 -抟 -抠 -抡 -抢 -护 -报 -抨 -披 -抬 -抱 -抵 -抹 -押 -抽 -抿 -拂 -拄 -担 -拆 -拇 -拈 -拉 -拋 -拌 -拍 -拎 -拐 -拒 -拓 -拔 -拖 -拗 -拘 -拙 -拚 -招 -拜 -拟 -拡 -拢 -拣 -拥 -拦 -拧 -拨 -择 -括 -拭 -拮 -拯 -拱 -拳 -拴 -拷 -拼 -拽 -拾 -拿 -持 -挂 -指 -挈 -按 -挎 -挑 -挖 -挙 -挚 -挛 -挝 -挞 -挟 -挠 -挡 -挣 -挤 -挥 -挨 -挪 -挫 -振 -挲 -挹 -挺 -挽 -挾 -捂 -捅 -捆 -捉 -捋 -捌 -捍 -捎 -捏 -捐 -捕 -捞 -损 -捡 -换 -捣 -捧 -捨 -捩 -据 -捱 -捲 -捶 -捷 -捺 -捻 -掀 -掂 -掃 -掇 -授 -掉 -掌 -掏 -掐 -排 -掖 -掘 -掙 -掛 -掠 -採 -探 -掣 -接 -控 -推 -掩 -措 -掬 -掰 -掲 -掳 -掴 -掷 -掸 -掺 -揀 -揃 -揄 -揆 -揉 -揍 -描 -提 -插 -揖 -揚 -換 -握 -揣 -揩 -揪 -揭 -揮 -援 -揶 -揸 -揹 -揽 -搀 -搁 
-搂 -搅 -損 -搏 -搐 -搓 -搔 -搖 -搗 -搜 -搞 -搡 -搪 -搬 -搭 -搵 -搶 -携 -搽 -摀 -摁 -摄 -摆 -摇 -摈 -摊 -摒 -摔 -摘 -摞 -摟 -摧 -摩 -摯 -摳 -摸 -摹 -摺 -摻 -撂 -撃 -撅 -撇 -撈 -撐 -撑 -撒 -撓 -撕 -撚 -撞 -撤 -撥 -撩 -撫 -撬 -播 -撮 -撰 -撲 -撵 -撷 -撸 -撻 -撼 -撿 -擀 -擁 -擂 -擄 -擅 -擇 -擊 -擋 -操 -擎 -擒 -擔 -擘 -據 -擞 -擠 -擡 -擢 -擦 -擬 -擰 -擱 -擲 -擴 -擷 -擺 -擼 -擾 -攀 -攏 -攒 -攔 -攘 -攙 -攜 -攝 -攞 -攢 -攣 -攤 -攥 -攪 -攫 -攬 -支 -收 -攸 -改 -攻 -放 -政 -故 -效 -敌 -敍 -敎 -敏 -救 -敕 -敖 -敗 -敘 -教 -敛 -敝 -敞 -敢 -散 -敦 -敬 -数 -敲 -整 -敵 -敷 -數 -斂 -斃 -文 -斋 -斌 -斎 -斐 -斑 -斓 -斗 -料 -斛 -斜 -斟 -斡 -斤 -斥 -斧 -斩 -斫 -斬 -断 -斯 -新 -斷 -方 -於 -施 -旁 -旃 -旅 -旋 -旌 -旎 -族 -旖 -旗 -无 -既 -日 -旦 -旧 -旨 -早 -旬 -旭 -旮 -旱 -时 -旷 -旺 -旻 -昀 -昂 -昆 -昇 -昉 -昊 -昌 -明 -昏 -易 -昔 -昕 -昙 -星 -映 -春 -昧 -昨 -昭 -是 -昱 -昴 -昵 -昶 -昼 -显 -晁 -時 -晃 -晉 -晋 -晌 -晏 -晒 -晓 -晔 -晕 -晖 -晗 -晚 -晝 -晞 -晟 -晤 -晦 -晨 -晩 -普 -景 -晰 -晴 -晶 -晷 -智 -晾 -暂 -暄 -暇 -暈 -暉 -暌 -暐 -暑 -暖 -暗 -暝 -暢 -暧 -暨 -暫 -暮 -暱 -暴 -暸 -暹 -曄 -曆 -曇 -曉 -曖 -曙 -曜 -曝 -曠 -曦 -曬 -曰 -曲 -曳 -更 -書 -曹 -曼 -曾 -替 -最 -會 -月 -有 -朋 -服 -朐 -朔 -朕 -朗 -望 -朝 -期 -朦 -朧 -木 -未 -末 -本 -札 -朮 -术 -朱 -朴 -朵 -机 -朽 -杀 -杂 -权 -杆 -杈 -杉 -李 -杏 -材 -村 -杓 -杖 -杜 -杞 -束 -杠 -条 -来 -杨 -杭 -杯 -杰 -東 -杳 -杵 -杷 -杼 -松 -板 -极 -构 -枇 -枉 -枋 -析 -枕 -林 -枚 -果 -枝 -枢 -枣 -枪 -枫 -枭 -枯 -枰 -枱 -枳 -架 -枷 -枸 -柄 -柏 -某 -柑 -柒 -染 -柔 -柘 -柚 -柜 -柞 -柠 -柢 -查 -柩 -柬 -柯 -柱 -柳 -柴 -柵 -査 -柿 -栀 -栃 -栄 -栅 -标 -栈 -栉 -栋 -栎 -栏 -树 -栓 -栖 -栗 -校 -栩 -株 -样 -核 -根 -格 -栽 -栾 -桀 -桁 -桂 -桃 -桅 -框 -案 -桉 -桌 -桎 -桐 -桑 -桓 -桔 -桜 -桠 -桡 -桢 -档 -桥 -桦 -桧 -桨 -桩 -桶 -桿 -梁 -梅 -梆 -梏 -梓 -梗 -條 -梟 -梢 -梦 -梧 -梨 -梭 -梯 -械 -梳 -梵 -梶 -检 -棂 -棄 -棉 -棋 -棍 -棒 -棕 -棗 -棘 -棚 -棟 -棠 -棣 -棧 -森 -棱 -棲 -棵 -棹 -棺 -椁 -椅 -椋 -植 -椎 -椒 -検 -椪 -椭 -椰 -椹 -椽 -椿 -楂 -楊 -楓 -楔 -楚 -楝 -楞 -楠 -楣 -楨 -楫 -業 -楮 -極 -楷 -楸 -楹 -楼 -楽 -概 -榄 -榆 -榈 -榉 -榔 -榕 -榖 -榛 -榜 -榨 -榫 -榭 -榮 -榱 -榴 -榷 -榻 -槁 -槃 -構 -槌 -槍 -槎 -槐 -槓 -様 -槛 -槟 -槤 -槭 -槲 -槳 -槻 -槽 -槿 -樁 -樂 -樊 -樑 -樓 -標 -樞 -樟 -模 -樣 -権 -横 -樫 -樯 -樱 -樵 -樸 -樹 -樺 -樽 -樾 -橄 -橇 -橋 -橐 -橘 -橙 -機 -橡 -橢 -橫 -橱 -橹 -橼 -檀 -檄 -檎 -檐 -檔 -檗 -檜 -檢 -檬 -檯 -檳 -檸 -檻 -櫃 -櫚 -櫛 -櫥 -櫸 -櫻 -欄 -權 -欒 -欖 -欠 -次 -欢 -欣 -欧 -欲 -欸 -欺 -欽 -款 -歆 -歇 -歉 -歌 -歎 -歐 -歓 -歙 -歛 -歡 -止 -正 -此 -步 -武 -歧 -歩 -歪 -歯 -歲 -歳 -歴 -歷 -歸 -歹 -死 -歼 -殁 -殃 -殆 -殇 -殉 -殊 -残 -殒 -殓 -殖 -殘 -殞 -殡 -殤 -殭 -殯 -殲 -殴 -段 -殷 -殺 -殼 -殿 -毀 -毁 
-毂 -毅 -毆 -毋 -母 -毎 -每 -毒 -毓 -比 -毕 -毗 -毘 -毙 -毛 -毡 -毫 -毯 -毽 -氈 -氏 -氐 -民 -氓 -气 -氖 -気 -氙 -氛 -氟 -氡 -氢 -氣 -氤 -氦 -氧 -氨 -氪 -氫 -氮 -氯 -氰 -氲 -水 -氷 -永 -氹 -氾 -汀 -汁 -求 -汆 -汇 -汉 -汎 -汐 -汕 -汗 -汙 -汛 -汝 -汞 -江 -池 -污 -汤 -汨 -汩 -汪 -汰 -汲 -汴 -汶 -汹 -決 -汽 -汾 -沁 -沂 -沃 -沅 -沈 -沉 -沌 -沏 -沐 -沒 -沓 -沖 -沙 -沛 -沟 -没 -沢 -沣 -沥 -沦 -沧 -沪 -沫 -沭 -沮 -沱 -河 -沸 -油 -治 -沼 -沽 -沾 -沿 -況 -泄 -泉 -泊 -泌 -泓 -法 -泗 -泛 -泞 -泠 -泡 -波 -泣 -泥 -注 -泪 -泫 -泮 -泯 -泰 -泱 -泳 -泵 -泷 -泸 -泻 -泼 -泽 -泾 -洁 -洄 -洋 -洒 -洗 -洙 -洛 -洞 -津 -洩 -洪 -洮 -洱 -洲 -洵 -洶 -洸 -洹 -活 -洼 -洽 -派 -流 -浃 -浄 -浅 -浆 -浇 -浊 -测 -济 -浏 -浑 -浒 -浓 -浔 -浙 -浚 -浜 -浣 -浦 -浩 -浪 -浬 -浮 -浯 -浴 -海 -浸 -涂 -涅 -涇 -消 -涉 -涌 -涎 -涓 -涔 -涕 -涙 -涛 -涝 -涞 -涟 -涠 -涡 -涣 -涤 -润 -涧 -涨 -涩 -涪 -涮 -涯 -液 -涵 -涸 -涼 -涿 -淀 -淄 -淅 -淆 -淇 -淋 -淌 -淑 -淒 -淖 -淘 -淙 -淚 -淞 -淡 -淤 -淦 -淨 -淩 -淪 -淫 -淬 -淮 -深 -淳 -淵 -混 -淹 -淺 -添 -淼 -清 -済 -渉 -渊 -渋 -渍 -渎 -渐 -渔 -渗 -渙 -渚 -減 -渝 -渠 -渡 -渣 -渤 -渥 -渦 -温 -測 -渭 -港 -渲 -渴 -游 -渺 -渾 -湃 -湄 -湊 -湍 -湖 -湘 -湛 -湟 -湧 -湫 -湮 -湯 -湳 -湾 -湿 -満 -溃 -溅 -溉 -溏 -源 -準 -溜 -溝 -溟 -溢 -溥 -溧 -溪 -溫 -溯 -溱 -溴 -溶 -溺 -溼 -滁 -滂 -滄 -滅 -滇 -滋 -滌 -滑 -滓 -滔 -滕 -滙 -滚 -滝 -滞 -滟 -满 -滢 -滤 -滥 -滦 -滨 -滩 -滬 -滯 -滲 -滴 -滷 -滸 -滾 -滿 -漁 -漂 -漆 -漉 -漏 -漓 -演 -漕 -漠 -漢 -漣 -漩 -漪 -漫 -漬 -漯 -漱 -漲 -漳 -漸 -漾 -漿 -潆 -潇 -潋 -潍 -潑 -潔 -潘 -潛 -潜 -潞 -潟 -潢 -潤 -潦 -潧 -潭 -潮 -潰 -潴 -潸 -潺 -潼 -澀 -澄 -澆 -澈 -澍 -澎 -澗 -澜 -澡 -澤 -澧 -澱 -澳 -澹 -激 -濁 -濂 -濃 -濑 -濒 -濕 -濘 -濛 -濟 -濠 -濡 -濤 -濫 -濬 -濮 -濯 -濱 -濺 -濾 -瀅 -瀆 -瀉 -瀋 -瀏 -瀑 -瀕 -瀘 -瀚 -瀛 -瀝 -瀞 -瀟 -瀧 -瀨 -瀬 -瀰 -瀾 -灌 -灏 -灑 -灘 -灝 -灞 -灣 -火 -灬 -灭 -灯 -灰 -灵 -灶 -灸 -灼 -災 -灾 -灿 -炀 -炁 -炅 -炉 -炊 -炎 -炒 -炔 -炕 -炖 -炙 -炜 -炫 -炬 -炭 -炮 -炯 -炳 -炷 -炸 -点 -為 -炼 -炽 -烁 -烂 -烃 -烈 -烊 -烏 -烘 -烙 -烛 -烟 -烤 -烦 -烧 -烨 -烩 -烫 -烬 -热 -烯 -烷 -烹 -烽 -焉 -焊 -焕 -焖 -焗 -焘 -焙 -焚 -焜 -無 -焦 -焯 -焰 -焱 -然 -焼 -煅 -煉 -煊 -煌 -煎 -煒 -煖 -煙 -煜 -煞 -煤 -煥 -煦 -照 -煨 -煩 -煮 -煲 -煸 -煽 -熄 -熊 -熏 -熒 -熔 -熙 -熟 -熠 -熨 -熬 -熱 -熵 -熹 -熾 -燁 -燃 -燄 -燈 -燉 -燊 -燎 -燒 -燔 -燕 -燙 -燜 -營 -燥 -燦 -燧 -燭 -燮 -燴 -燻 -燼 -燿 -爆 -爍 -爐 -爛 -爪 -爬 -爭 -爰 -爱 -爲 -爵 -父 -爷 -爸 -爹 -爺 -爻 -爽 -爾 -牆 -片 -版 -牌 -牍 -牒 -牙 -牛 -牝 -牟 -牠 -牡 -牢 -牦 -牧 -物 -牯 -牲 -牴 -牵 -特 -牺 -牽 -犀 -犁 -犄 -犊 -犍 -犒 -犢 -犧 -犬 -犯 -状 -犷 -犸 -犹 -狀 -狂 -狄 -狈 -狎 -狐 -狒 -狗 -狙 -狞 -狠 -狡 -狩 -独 -狭 -狮 -狰 -狱 -狸 -狹 -狼 -狽 -猎 -猕 -猖 -猗 -猙 -猛 -猜 
-猝 -猥 -猩 -猪 -猫 -猬 -献 -猴 -猶 -猷 -猾 -猿 -獄 -獅 -獎 -獐 -獒 -獗 -獠 -獣 -獨 -獭 -獰 -獲 -獵 -獷 -獸 -獺 -獻 -獼 -獾 -玄 -率 -玉 -王 -玑 -玖 -玛 -玟 -玠 -玥 -玩 -玫 -玮 -环 -现 -玲 -玳 -玷 -玺 -玻 -珀 -珂 -珅 -珈 -珉 -珊 -珍 -珏 -珐 -珑 -珙 -珞 -珠 -珣 -珥 -珩 -珪 -班 -珮 -珲 -珺 -現 -球 -琅 -理 -琇 -琉 -琊 -琍 -琏 -琐 -琛 -琢 -琥 -琦 -琨 -琪 -琬 -琮 -琰 -琲 -琳 -琴 -琵 -琶 -琺 -琼 -瑀 -瑁 -瑄 -瑋 -瑕 -瑗 -瑙 -瑚 -瑛 -瑜 -瑞 -瑟 -瑠 -瑣 -瑤 -瑩 -瑪 -瑯 -瑰 -瑶 -瑾 -璀 -璁 -璃 -璇 -璉 -璋 -璎 -璐 -璜 -璞 -璟 -璧 -璨 -環 -璽 -璿 -瓊 -瓏 -瓒 -瓜 -瓢 -瓣 -瓤 -瓦 -瓮 -瓯 -瓴 -瓶 -瓷 -甄 -甌 -甕 -甘 -甙 -甚 -甜 -生 -產 -産 -甥 -甦 -用 -甩 -甫 -甬 -甭 -甯 -田 -由 -甲 -申 -电 -男 -甸 -町 -画 -甾 -畀 -畅 -界 -畏 -畑 -畔 -留 -畜 -畝 -畢 -略 -畦 -番 -畫 -異 -畲 -畳 -畴 -當 -畸 -畹 -畿 -疆 -疇 -疊 -疏 -疑 -疔 -疖 -疗 -疙 -疚 -疝 -疟 -疡 -疣 -疤 -疥 -疫 -疮 -疯 -疱 -疲 -疳 -疵 -疸 -疹 -疼 -疽 -疾 -痂 -病 -症 -痈 -痉 -痊 -痍 -痒 -痔 -痕 -痘 -痙 -痛 -痞 -痠 -痢 -痣 -痤 -痧 -痨 -痪 -痫 -痰 -痱 -痴 -痹 -痺 -痼 -痿 -瘀 -瘁 -瘋 -瘍 -瘓 -瘘 -瘙 -瘟 -瘠 -瘡 -瘢 -瘤 -瘦 -瘧 -瘩 -瘪 -瘫 -瘴 -瘸 -瘾 -療 -癇 -癌 -癒 -癖 -癜 -癞 -癡 -癢 -癣 -癥 -癫 -癬 -癮 -癱 -癲 -癸 -発 -登 -發 -白 -百 -皂 -的 -皆 -皇 -皈 -皋 -皎 -皑 -皓 -皖 -皙 -皚 -皮 -皰 -皱 -皴 -皺 -皿 -盂 -盃 -盅 -盆 -盈 -益 -盎 -盏 -盐 -监 -盒 -盔 -盖 -盗 -盘 -盛 -盜 -盞 -盟 -盡 -監 -盤 -盥 -盧 -盪 -目 -盯 -盱 -盲 -直 -相 -盹 -盼 -盾 -省 -眈 -眉 -看 -県 -眙 -眞 -真 -眠 -眦 -眨 -眩 -眯 -眶 -眷 -眸 -眺 -眼 -眾 -着 -睁 -睇 -睏 -睐 -睑 -睛 -睜 -睞 -睡 -睢 -督 -睥 -睦 -睨 -睪 -睫 -睬 -睹 -睽 -睾 -睿 -瞄 -瞅 -瞇 -瞋 -瞌 -瞎 -瞑 -瞒 -瞓 -瞞 -瞟 -瞠 -瞥 -瞧 -瞩 -瞪 -瞬 -瞭 -瞰 -瞳 -瞻 -瞼 -瞿 -矇 -矍 -矗 -矚 -矛 -矜 -矢 -矣 -知 -矩 -矫 -短 -矮 -矯 -石 -矶 -矽 -矾 -矿 -码 -砂 -砌 -砍 -砒 -研 -砖 -砗 -砚 -砝 -砣 -砥 -砧 -砭 -砰 -砲 -破 -砷 -砸 -砺 -砼 -砾 -础 -硅 -硐 -硒 -硕 -硝 -硫 -硬 -确 -硯 -硼 -碁 -碇 -碉 -碌 -碍 -碎 -碑 -碓 -碗 -碘 -碚 -碛 -碟 -碣 -碧 -碩 -碰 -碱 -碳 -碴 -確 -碼 -碾 -磁 -磅 -磊 -磋 -磐 -磕 -磚 -磡 -磨 -磬 -磯 -磲 -磷 -磺 -礁 -礎 -礙 -礡 -礦 -礪 -礫 -礴 -示 -礼 -社 -祀 -祁 -祂 -祇 -祈 -祉 -祎 -祐 -祕 -祖 -祗 -祚 -祛 -祜 -祝 -神 -祟 -祠 -祢 -祥 -票 -祭 -祯 -祷 -祸 -祺 -祿 -禀 -禁 -禄 -禅 -禍 -禎 -福 -禛 -禦 -禧 -禪 -禮 -禱 -禹 -禺 -离 -禽 -禾 -禿 -秀 -私 -秃 -秆 -秉 -秋 -种 -科 -秒 -秘 -租 -秣 -秤 -秦 -秧 -秩 -秭 -积 -称 -秸 -移 -秽 -稀 -稅 -程 -稍 -税 -稔 -稗 -稚 -稜 -稞 -稟 -稠 -稣 -種 -稱 -稲 -稳 -稷 -稹 -稻 -稼 -稽 -稿 -穀 -穂 -穆 -穌 -積 -穎 -穗 -穢 -穩 -穫 -穴 -究 -穷 -穹 -空 -穿 -突 -窃 -窄 -窈 -窍 -窑 -窒 -窓 -窕 -窖 -窗 -窘 -窜 -窝 -窟 -窠 -窥 -窦 -窨 -窩 -窪 -窮 -窯 -窺 -窿 -竄 -竅 -竇 -竊 -立 -竖 -站 -竜 -竞 -竟 -章 -竣 -童 -竭 -端 -競 -竹 -竺 -竽 -竿 -笃 
-笆 -笈 -笋 -笏 -笑 -笔 -笙 -笛 -笞 -笠 -符 -笨 -第 -笹 -笺 -笼 -筆 -等 -筊 -筋 -筍 -筏 -筐 -筑 -筒 -答 -策 -筛 -筝 -筠 -筱 -筲 -筵 -筷 -筹 -签 -简 -箇 -箋 -箍 -箏 -箐 -箔 -箕 -算 -箝 -管 -箩 -箫 -箭 -箱 -箴 -箸 -節 -篁 -範 -篆 -篇 -築 -篑 -篓 -篙 -篝 -篠 -篡 -篤 -篩 -篪 -篮 -篱 -篷 -簇 -簌 -簍 -簡 -簦 -簧 -簪 -簫 -簷 -簸 -簽 -簾 -簿 -籁 -籃 -籌 -籍 -籐 -籟 -籠 -籤 -籬 -籮 -籲 -米 -类 -籼 -籽 -粄 -粉 -粑 -粒 -粕 -粗 -粘 -粟 -粤 -粥 -粧 -粪 -粮 -粱 -粲 -粳 -粵 -粹 -粼 -粽 -精 -粿 -糅 -糊 -糍 -糕 -糖 -糗 -糙 -糜 -糞 -糟 -糠 -糧 -糬 -糯 -糰 -糸 -系 -糾 -紀 -紂 -約 -紅 -紉 -紊 -紋 -納 -紐 -紓 -純 -紗 -紘 -紙 -級 -紛 -紜 -素 -紡 -索 -紧 -紫 -紮 -累 -細 -紳 -紹 -紺 -終 -絃 -組 -絆 -経 -結 -絕 -絞 -絡 -絢 -給 -絨 -絮 -統 -絲 -絳 -絵 -絶 -絹 -綁 -綏 -綑 -經 -継 -続 -綜 -綠 -綢 -綦 -綫 -綬 -維 -綱 -網 -綴 -綵 -綸 -綺 -綻 -綽 -綾 -綿 -緊 -緋 -総 -緑 -緒 -緘 -線 -緝 -緞 -締 -緣 -編 -緩 -緬 -緯 -練 -緹 -緻 -縁 -縄 -縈 -縛 -縝 -縣 -縫 -縮 -縱 -縴 -縷 -總 -績 -繁 -繃 -繆 -繇 -繋 -織 -繕 -繚 -繞 -繡 -繩 -繪 -繫 -繭 -繳 -繹 -繼 -繽 -纂 -續 -纍 -纏 -纓 -纔 -纖 -纜 -纠 -红 -纣 -纤 -约 -级 -纨 -纪 -纫 -纬 -纭 -纯 -纰 -纱 -纲 -纳 -纵 -纶 -纷 -纸 -纹 -纺 -纽 -纾 -线 -绀 -练 -组 -绅 -细 -织 -终 -绊 -绍 -绎 -经 -绑 -绒 -结 -绔 -绕 -绘 -给 -绚 -绛 -络 -绝 -绞 -统 -绡 -绢 -绣 -绥 -绦 -继 -绩 -绪 -绫 -续 -绮 -绯 -绰 -绳 -维 -绵 -绶 -绷 -绸 -绻 -综 -绽 -绾 -绿 -缀 -缄 -缅 -缆 -缇 -缈 -缉 -缎 -缓 -缔 -缕 -编 -缘 -缙 -缚 -缜 -缝 -缠 -缢 -缤 -缥 -缨 -缩 -缪 -缭 -缮 -缰 -缱 -缴 -缸 -缺 -缽 -罂 -罄 -罌 -罐 -网 -罔 -罕 -罗 -罚 -罡 -罢 -罩 -罪 -置 -罰 -署 -罵 -罷 -罹 -羁 -羅 -羈 -羊 -羌 -美 -羔 -羚 -羞 -羟 -羡 -羣 -群 -羥 -羧 -羨 -義 -羯 -羲 -羸 -羹 -羽 -羿 -翁 -翅 -翊 -翌 -翎 -習 -翔 -翘 -翟 -翠 -翡 -翦 -翩 -翰 -翱 -翳 -翹 -翻 -翼 -耀 -老 -考 -耄 -者 -耆 -耋 -而 -耍 -耐 -耒 -耕 -耗 -耘 -耙 -耦 -耨 -耳 -耶 -耷 -耸 -耻 -耽 -耿 -聂 -聆 -聊 -聋 -职 -聒 -联 -聖 -聘 -聚 -聞 -聪 -聯 -聰 -聲 -聳 -聴 -聶 -職 -聽 -聾 -聿 -肃 -肄 -肅 -肆 -肇 -肉 -肋 -肌 -肏 -肓 -肖 -肘 -肚 -肛 -肝 -肠 -股 -肢 -肤 -肥 -肩 -肪 -肮 -肯 -肱 -育 -肴 -肺 -肽 -肾 -肿 -胀 -胁 -胃 -胄 -胆 -背 -胍 -胎 -胖 -胚 -胛 -胜 -胝 -胞 -胡 -胤 -胥 -胧 -胫 -胭 -胯 -胰 -胱 -胳 -胴 -胶 -胸 -胺 -能 -脂 -脅 -脆 -脇 -脈 -脉 -脊 -脍 -脏 -脐 -脑 -脓 -脖 -脘 -脚 -脛 -脣 -脩 -脫 -脯 -脱 -脲 -脳 -脸 -脹 -脾 -腆 -腈 -腊 -腋 -腌 -腎 -腐 -腑 -腓 -腔 -腕 -腥 -腦 -腩 -腫 -腭 -腮 -腰 -腱 -腳 -腴 -腸 -腹 -腺 -腻 -腼 -腾 -腿 -膀 -膈 -膊 -膏 -膑 -膘 -膚 -膛 -膜 -膝 -膠 -膦 -膨 -膩 -膳 -膺 -膻 -膽 -膾 -膿 -臀 -臂 -臃 -臆 -臉 -臊 -臍 -臓 -臘 -臟 -臣 -臥 -臧 -臨 -自 -臬 -臭 -至 -致 -臺 -臻 -臼 -臾 -舀 -舂 -舅 -舆 -與 -興 -舉 -舊 -舌 -舍 -舎 -舐 -舒 -舔 -舖 -舗 -舛 -舜 -舞 -舟 -航 -舫 -般 -舰 -舱 -舵 -舶 -舷 -舸 -船 -舺 
-舾 -艇 -艋 -艘 -艙 -艦 -艮 -良 -艰 -艱 -色 -艳 -艷 -艹 -艺 -艾 -节 -芃 -芈 -芊 -芋 -芍 -芎 -芒 -芙 -芜 -芝 -芡 -芥 -芦 -芩 -芪 -芫 -芬 -芭 -芮 -芯 -花 -芳 -芷 -芸 -芹 -芻 -芽 -芾 -苁 -苄 -苇 -苋 -苍 -苏 -苑 -苒 -苓 -苔 -苕 -苗 -苛 -苜 -苞 -苟 -苡 -苣 -若 -苦 -苫 -苯 -英 -苷 -苹 -苻 -茁 -茂 -范 -茄 -茅 -茉 -茎 -茏 -茗 -茜 -茧 -茨 -茫 -茬 -茭 -茯 -茱 -茲 -茴 -茵 -茶 -茸 -茹 -茼 -荀 -荃 -荆 -草 -荊 -荏 -荐 -荒 -荔 -荖 -荘 -荚 -荞 -荟 -荠 -荡 -荣 -荤 -荥 -荧 -荨 -荪 -荫 -药 -荳 -荷 -荸 -荻 -荼 -荽 -莅 -莆 -莉 -莊 -莎 -莒 -莓 -莖 -莘 -莞 -莠 -莢 -莧 -莪 -莫 -莱 -莲 -莴 -获 -莹 -莺 -莽 -莿 -菀 -菁 -菅 -菇 -菈 -菊 -菌 -菏 -菓 -菖 -菘 -菜 -菟 -菠 -菡 -菩 -華 -菱 -菲 -菸 -菽 -萁 -萃 -萄 -萊 -萋 -萌 -萍 -萎 -萘 -萝 -萤 -营 -萦 -萧 -萨 -萩 -萬 -萱 -萵 -萸 -萼 -落 -葆 -葉 -著 -葚 -葛 -葡 -董 -葦 -葩 -葫 -葬 -葭 -葯 -葱 -葳 -葵 -葷 -葺 -蒂 -蒋 -蒐 -蒔 -蒙 -蒜 -蒞 -蒟 -蒡 -蒨 -蒲 -蒸 -蒹 -蒻 -蒼 -蒿 -蓁 -蓄 -蓆 -蓉 -蓋 -蓑 -蓓 -蓖 -蓝 -蓟 -蓦 -蓬 -蓮 -蓼 -蓿 -蔑 -蔓 -蔔 -蔗 -蔘 -蔚 -蔡 -蔣 -蔥 -蔫 -蔬 -蔭 -蔵 -蔷 -蔺 -蔻 -蔼 -蔽 -蕁 -蕃 -蕈 -蕉 -蕊 -蕎 -蕙 -蕤 -蕨 -蕩 -蕪 -蕭 -蕲 -蕴 -蕻 -蕾 -薄 -薅 -薇 -薈 -薊 -薏 -薑 -薔 -薙 -薛 -薦 -薨 -薩 -薪 -薬 -薯 -薰 -薹 -藉 -藍 -藏 -藐 -藓 -藕 -藜 -藝 -藤 -藥 -藩 -藹 -藻 -藿 -蘆 -蘇 -蘊 -蘋 -蘑 -蘚 -蘭 -蘸 -蘼 -蘿 -虎 -虏 -虐 -虑 -虔 -處 -虚 -虛 -虜 -虞 -號 -虢 -虧 -虫 -虬 -虱 -虹 -虻 -虽 -虾 -蚀 -蚁 -蚂 -蚊 -蚌 -蚓 -蚕 -蚜 -蚝 -蚣 -蚤 -蚩 -蚪 -蚯 -蚱 -蚵 -蛀 -蛆 -蛇 -蛊 -蛋 -蛎 -蛐 -蛔 -蛙 -蛛 -蛟 -蛤 -蛭 -蛮 -蛰 -蛳 -蛹 -蛻 -蛾 -蜀 -蜂 -蜃 -蜆 -蜇 -蜈 -蜊 -蜍 -蜒 -蜓 -蜕 -蜗 -蜘 -蜚 -蜜 -蜡 -蜢 -蜥 -蜱 -蜴 -蜷 -蜻 -蜿 -蝇 -蝈 -蝉 -蝌 -蝎 -蝕 -蝗 -蝙 -蝟 -蝠 -蝦 -蝨 -蝴 -蝶 -蝸 -蝼 -螂 -螃 -融 -螞 -螢 -螨 -螯 -螳 -螺 -蟀 -蟄 -蟆 -蟋 -蟎 -蟑 -蟒 -蟠 -蟬 -蟲 -蟹 -蟻 -蟾 -蠅 -蠍 -蠔 -蠕 -蠛 -蠟 -蠡 -蠢 -蠣 -蠱 -蠶 -蠹 -蠻 -血 -衄 -衅 -衆 -行 -衍 -術 -衔 -街 -衙 -衛 -衝 -衞 -衡 -衢 -衣 -补 -表 -衩 -衫 -衬 -衮 -衰 -衲 -衷 -衹 -衾 -衿 -袁 -袂 -袄 -袅 -袈 -袋 -袍 -袒 -袖 -袜 -袞 -袤 -袪 -被 -袭 -袱 -裁 -裂 -装 -裆 -裊 -裏 -裔 -裕 -裘 -裙 -補 -裝 -裟 -裡 -裤 -裨 -裱 -裳 -裴 -裸 -裹 -製 -裾 -褂 -複 -褐 -褒 -褓 -褔 -褚 -褥 -褪 -褫 -褲 -褶 -褻 -襁 -襄 -襟 -襠 -襪 -襬 -襯 -襲 -西 -要 -覃 -覆 -覇 -見 -規 -覓 -視 -覚 -覦 -覧 -親 -覬 -観 -覷 -覺 -覽 -觀 -见 -观 -规 -觅 -视 -览 -觉 -觊 -觎 -觐 -觑 -角 -觞 -解 -觥 -触 -觸 -言 -訂 -計 -訊 -討 -訓 -訕 -訖 -託 -記 -訛 -訝 -訟 -訣 -訥 -訪 -設 -許 -訳 -訴 -訶 -診 -註 -証 -詆 -詐 -詔 -評 -詛 -詞 -詠 -詡 -詢 -詣 -試 -詩 -詫 -詬 -詭 -詮 -詰 -話 -該 -詳 -詹 -詼 -誅 -誇 -誉 -誌 -認 -誓 -誕 -誘 -語 -誠 -誡 -誣 -誤 -誥 -誦 -誨 -說 -説 -読 -誰 -課 -誹 -誼 -調 -諄 -談 -請 -諏 -諒 -論 -諗 -諜 -諡 -諦 -諧 -諫 -諭 -諮 -諱 -諳 -諷 -諸 -諺 -諾 -謀 -謁 -謂 -謄 -謊 -謎 -謐 
-謔 -謗 -謙 -講 -謝 -謠 -謨 -謬 -謹 -謾 -譁 -證 -譎 -譏 -識 -譙 -譚 -譜 -警 -譬 -譯 -議 -譲 -譴 -護 -譽 -讀 -變 -讓 -讚 -讞 -计 -订 -认 -讥 -讧 -讨 -让 -讪 -讫 -训 -议 -讯 -记 -讲 -讳 -讴 -讶 -讷 -许 -讹 -论 -讼 -讽 -设 -访 -诀 -证 -诃 -评 -诅 -识 -诈 -诉 -诊 -诋 -词 -诏 -译 -试 -诗 -诘 -诙 -诚 -诛 -话 -诞 -诟 -诠 -诡 -询 -诣 -诤 -该 -详 -诧 -诩 -诫 -诬 -语 -误 -诰 -诱 -诲 -说 -诵 -诶 -请 -诸 -诺 -读 -诽 -课 -诿 -谀 -谁 -调 -谄 -谅 -谆 -谈 -谊 -谋 -谌 -谍 -谎 -谏 -谐 -谑 -谒 -谓 -谔 -谕 -谗 -谘 -谙 -谚 -谛 -谜 -谟 -谢 -谣 -谤 -谥 -谦 -谧 -谨 -谩 -谪 -谬 -谭 -谯 -谱 -谲 -谴 -谶 -谷 -豁 -豆 -豇 -豈 -豉 -豊 -豌 -豎 -豐 -豔 -豚 -象 -豢 -豪 -豫 -豬 -豹 -豺 -貂 -貅 -貌 -貓 -貔 -貘 -貝 -貞 -負 -財 -貢 -貧 -貨 -販 -貪 -貫 -責 -貯 -貰 -貳 -貴 -貶 -買 -貸 -費 -貼 -貽 -貿 -賀 -賁 -賂 -賃 -賄 -資 -賈 -賊 -賑 -賓 -賜 -賞 -賠 -賡 -賢 -賣 -賤 -賦 -質 -賬 -賭 -賴 -賺 -購 -賽 -贅 -贈 -贊 -贍 -贏 -贓 -贖 -贛 -贝 -贞 -负 -贡 -财 -责 -贤 -败 -账 -货 -质 -贩 -贪 -贫 -贬 -购 -贮 -贯 -贰 -贱 -贲 -贴 -贵 -贷 -贸 -费 -贺 -贻 -贼 -贾 -贿 -赁 -赂 -赃 -资 -赅 -赈 -赊 -赋 -赌 -赎 -赏 -赐 -赓 -赔 -赖 -赘 -赚 -赛 -赝 -赞 -赠 -赡 -赢 -赣 -赤 -赦 -赧 -赫 -赭 -走 -赳 -赴 -赵 -赶 -起 -趁 -超 -越 -趋 -趕 -趙 -趟 -趣 -趨 -足 -趴 -趵 -趸 -趺 -趾 -跃 -跄 -跆 -跋 -跌 -跎 -跑 -跖 -跚 -跛 -距 -跟 -跡 -跤 -跨 -跩 -跪 -路 -跳 -践 -跷 -跹 -跺 -跻 -踉 -踊 -踌 -踏 -踐 -踝 -踞 -踟 -踢 -踩 -踪 -踮 -踱 -踴 -踵 -踹 -蹂 -蹄 -蹇 -蹈 -蹉 -蹊 -蹋 -蹑 -蹒 -蹙 -蹟 -蹣 -蹤 -蹦 -蹩 -蹬 -蹭 -蹲 -蹴 -蹶 -蹺 -蹼 -蹿 -躁 -躇 -躉 -躊 -躋 -躍 -躏 -躪 -身 -躬 -躯 -躲 -躺 -軀 -車 -軋 -軌 -軍 -軒 -軟 -転 -軸 -軼 -軽 -軾 -較 -載 -輒 -輓 -輔 -輕 -輛 -輝 -輟 -輩 -輪 -輯 -輸 -輻 -輾 -輿 -轄 -轅 -轆 -轉 -轍 -轎 -轟 -车 -轧 -轨 -轩 -转 -轭 -轮 -软 -轰 -轲 -轴 -轶 -轻 -轼 -载 -轿 -较 -辄 -辅 -辆 -辇 -辈 -辉 -辊 -辍 -辐 -辑 -输 -辕 -辖 -辗 -辘 -辙 -辛 -辜 -辞 -辟 -辣 -辦 -辨 -辩 -辫 -辭 -辮 -辯 -辰 -辱 -農 -边 -辺 -辻 -込 -辽 -达 -迁 -迂 -迄 -迅 -过 -迈 -迎 -运 -近 -返 -还 -这 -进 -远 -违 -连 -迟 -迢 -迤 -迥 -迦 -迩 -迪 -迫 -迭 -述 -迴 -迷 -迸 -迹 -迺 -追 -退 -送 -适 -逃 -逅 -逆 -选 -逊 -逍 -透 -逐 -递 -途 -逕 -逗 -這 -通 -逛 -逝 -逞 -速 -造 -逢 -連 -逮 -週 -進 -逵 -逶 -逸 -逻 -逼 -逾 -遁 -遂 -遅 -遇 -遊 -運 -遍 -過 -遏 -遐 -遑 -遒 -道 -達 -違 -遗 -遙 -遛 -遜 -遞 -遠 -遢 -遣 -遥 -遨 -適 -遭 -遮 -遲 -遴 -遵 -遶 -遷 -選 -遺 -遼 -遽 -避 -邀 -邁 -邂 -邃 -還 -邇 -邈 -邊 -邋 -邏 -邑 -邓 -邕 -邛 -邝 -邢 -那 -邦 -邨 -邪 -邬 -邮 -邯 -邰 -邱 -邳 -邵 -邸 -邹 -邺 -邻 -郁 -郅 -郊 -郎 -郑 -郜 -郝 -郡 -郢 -郤 -郦 -郧 -部 -郫 -郭 -郴 -郵 -郷 -郸 -都 -鄂 -鄉 -鄒 -鄔 -鄙 -鄞 -鄢 -鄧 -鄭 -鄰 -鄱 -鄲 -鄺 -酉 -酊 -酋 -酌 -配 -酐 -酒 -酗 -酚 -酝 -酢 -酣 -酥 -酩 -酪 -酬 -酮 -酯 -酰 -酱 -酵 -酶 -酷 -酸 -酿 -醃 -醇 
-醉 -醋 -醍 -醐 -醒 -醚 -醛 -醜 -醞 -醣 -醪 -醫 -醬 -醮 -醯 -醴 -醺 -釀 -釁 -采 -釉 -释 -釋 -里 -重 -野 -量 -釐 -金 -釗 -釘 -釜 -針 -釣 -釦 -釧 -釵 -鈀 -鈉 -鈍 -鈎 -鈔 -鈕 -鈞 -鈣 -鈦 -鈪 -鈴 -鈺 -鈾 -鉀 -鉄 -鉅 -鉉 -鉑 -鉗 -鉚 -鉛 -鉤 -鉴 -鉻 -銀 -銃 -銅 -銑 -銓 -銖 -銘 -銜 -銬 -銭 -銮 -銳 -銷 -銹 -鋁 -鋅 -鋒 -鋤 -鋪 -鋰 -鋸 -鋼 -錄 -錐 -錘 -錚 -錠 -錢 -錦 -錨 -錫 -錮 -錯 -録 -錳 -錶 -鍊 -鍋 -鍍 -鍛 -鍥 -鍰 -鍵 -鍺 -鍾 -鎂 -鎊 -鎌 -鎏 -鎔 -鎖 -鎗 -鎚 -鎧 -鎬 -鎮 -鎳 -鏈 -鏖 -鏗 -鏘 -鏞 -鏟 -鏡 -鏢 -鏤 -鏽 -鐘 -鐮 -鐲 -鐳 -鐵 -鐸 -鐺 -鑄 -鑊 -鑑 -鑒 -鑣 -鑫 -鑰 -鑲 -鑼 -鑽 -鑾 -鑿 -针 -钉 -钊 -钎 -钏 -钒 -钓 -钗 -钙 -钛 -钜 -钝 -钞 -钟 -钠 -钡 -钢 -钣 -钤 -钥 -钦 -钧 -钨 -钩 -钮 -钯 -钰 -钱 -钳 -钴 -钵 -钺 -钻 -钼 -钾 -钿 -铀 -铁 -铂 -铃 -铄 -铅 -铆 -铉 -铎 -铐 -铛 -铜 -铝 -铠 -铡 -铢 -铣 -铤 -铨 -铩 -铬 -铭 -铮 -铰 -铲 -铵 -银 -铸 -铺 -链 -铿 -销 -锁 -锂 -锄 -锅 -锆 -锈 -锉 -锋 -锌 -锏 -锐 -锑 -错 -锚 -锟 -锡 -锢 -锣 -锤 -锥 -锦 -锭 -键 -锯 -锰 -锲 -锵 -锹 -锺 -锻 -镀 -镁 -镂 -镇 -镉 -镌 -镍 -镐 -镑 -镕 -镖 -镗 -镛 -镜 -镣 -镭 -镯 -镰 -镳 -镶 -長 -长 -門 -閃 -閉 -開 -閎 -閏 -閑 -閒 -間 -閔 -閘 -閡 -関 -閣 -閥 -閨 -閩 -閱 -閲 -閹 -閻 -閾 -闆 -闇 -闊 -闌 -闍 -闔 -闕 -闖 -闘 -關 -闡 -闢 -门 -闪 -闫 -闭 -问 -闯 -闰 -闲 -间 -闵 -闷 -闸 -闹 -闺 -闻 -闽 -闾 -阀 -阁 -阂 -阅 -阆 -阇 -阈 -阉 -阎 -阐 -阑 -阔 -阕 -阖 -阙 -阚 -阜 -队 -阡 -阪 -阮 -阱 -防 -阳 -阴 -阵 -阶 -阻 -阿 -陀 -陂 -附 -际 -陆 -陇 -陈 -陋 -陌 -降 -限 -陕 -陛 -陝 -陞 -陟 -陡 -院 -陣 -除 -陨 -险 -陪 -陰 -陲 -陳 -陵 -陶 -陷 -陸 -険 -陽 -隅 -隆 -隈 -隊 -隋 -隍 -階 -随 -隐 -隔 -隕 -隘 -隙 -際 -障 -隠 -隣 -隧 -隨 -險 -隱 -隴 -隶 -隸 -隻 -隼 -隽 -难 -雀 -雁 -雄 -雅 -集 -雇 -雉 -雋 -雌 -雍 -雎 -雏 -雑 -雒 -雕 -雖 -雙 -雛 -雜 -雞 -離 -難 -雨 -雪 -雯 -雰 -雲 -雳 -零 -雷 -雹 -電 -雾 -需 -霁 -霄 -霆 -震 -霈 -霉 -霊 -霍 -霎 -霏 -霑 -霓 -霖 -霜 -霞 -霧 -霭 -霰 -露 -霸 -霹 -霽 -霾 -靂 -靄 -靈 -青 -靓 -靖 -静 -靚 -靛 -靜 -非 -靠 -靡 -面 -靥 -靦 -革 -靳 -靴 -靶 -靼 -鞅 -鞋 -鞍 -鞏 -鞑 -鞘 -鞠 -鞣 -鞦 -鞭 -韆 -韋 -韌 -韓 -韜 -韦 -韧 -韩 -韬 -韭 -音 -韵 -韶 -韻 -響 -頁 -頂 -頃 -項 -順 -須 -頌 -預 -頑 -頒 -頓 -頗 -領 -頜 -頡 -頤 -頫 -頭 -頰 -頷 -頸 -頹 -頻 -頼 -顆 -題 -額 -顎 -顏 -顔 -願 -顛 -類 -顧 -顫 -顯 -顱 -顴 -页 -顶 -顷 -项 -顺 -须 -顼 -顽 -顾 -顿 -颁 -颂 -预 -颅 -领 -颇 -颈 -颉 -颊 -颌 -颍 -颐 -频 -颓 -颔 -颖 -颗 -题 -颚 -颛 -颜 -额 -颞 -颠 -颡 -颢 -颤 -颦 -颧 -風 -颯 -颱 -颳 -颶 -颼 -飄 -飆 -风 -飒 -飓 -飕 -飘 -飙 -飚 -飛 -飞 -食 -飢 -飨 -飩 -飪 -飯 -飲 -飼 -飽 -飾 -餃 -餅 -餉 -養 -餌 -餐 -餒 -餓 -餘 -餚 -餛 -餞 -餡 -館 -餮 -餵 -餾 -饅 -饈 -饋 -饌 -饍 -饑 -饒 -饕 -饗 -饞 -饥 -饨 -饪 -饬 -饭 -饮 -饯 -饰 -饱 -饲 -饴 -饵 -饶 -饷 -饺 -饼 -饽 -饿 -馀 -馁 -馄 -馅 -馆 -馈 -馋 
-馍 -馏 -馒 -馔 -首 -馗 -香 -馥 -馨 -馬 -馭 -馮 -馳 -馴 -駁 -駄 -駅 -駆 -駐 -駒 -駕 -駛 -駝 -駭 -駱 -駿 -騁 -騎 -騏 -験 -騙 -騨 -騰 -騷 -驀 -驅 -驊 -驍 -驒 -驕 -驗 -驚 -驛 -驟 -驢 -驥 -马 -驭 -驮 -驯 -驰 -驱 -驳 -驴 -驶 -驷 -驸 -驹 -驻 -驼 -驾 -驿 -骁 -骂 -骄 -骅 -骆 -骇 -骈 -骊 -骋 -验 -骏 -骐 -骑 -骗 -骚 -骛 -骜 -骞 -骠 -骡 -骤 -骥 -骧 -骨 -骯 -骰 -骶 -骷 -骸 -骼 -髂 -髅 -髋 -髏 -髒 -髓 -體 -髖 -高 -髦 -髪 -髮 -髯 -髻 -鬃 -鬆 -鬍 -鬓 -鬚 -鬟 -鬢 -鬣 -鬥 -鬧 -鬱 -鬼 -魁 -魂 -魄 -魅 -魇 -魍 -魏 -魔 -魘 -魚 -魯 -魷 -鮑 -鮨 -鮪 -鮭 -鮮 -鯉 -鯊 -鯖 -鯛 -鯨 -鯰 -鯽 -鰍 -鰓 -鰭 -鰲 -鰻 -鰾 -鱈 -鱉 -鱔 -鱗 -鱷 -鱸 -鱼 -鱿 -鲁 -鲈 -鲍 -鲑 -鲛 -鲜 -鲟 -鲢 -鲤 -鲨 -鲫 -鲱 -鲲 -鲶 -鲷 -鲸 -鳃 -鳄 -鳅 -鳌 -鳍 -鳕 -鳖 -鳗 -鳝 -鳞 -鳥 -鳩 -鳳 -鳴 -鳶 -鴉 -鴕 -鴛 -鴦 -鴨 -鴻 -鴿 -鵑 -鵜 -鵝 -鵡 -鵬 -鵰 -鵲 -鶘 -鶩 -鶯 -鶴 -鷗 -鷲 -鷹 -鷺 -鸚 -鸞 -鸟 -鸠 -鸡 -鸢 -鸣 -鸥 -鸦 -鸨 -鸪 -鸭 -鸯 -鸳 -鸵 -鸽 -鸾 -鸿 -鹂 -鹃 -鹄 -鹅 -鹈 -鹉 -鹊 -鹌 -鹏 -鹑 -鹕 -鹘 -鹜 -鹞 -鹤 -鹦 -鹧 -鹫 -鹭 -鹰 -鹳 -鹵 -鹹 -鹼 -鹽 -鹿 -麂 -麋 -麒 -麓 -麗 -麝 -麟 -麥 -麦 -麩 -麴 -麵 -麸 -麺 -麻 -麼 -麽 -麾 -黃 -黄 -黍 -黎 -黏 -黑 -黒 -黔 -默 -黛 -黜 -黝 -點 -黠 -黨 -黯 -黴 -鼋 -鼎 -鼐 -鼓 -鼠 -鼬 -鼹 -鼻 -鼾 -齁 -齊 -齋 -齐 -齒 -齡 -齢 -齣 -齦 -齿 -龄 -龅 -龈 -龊 -龋 -龌 -龍 -龐 -龔 -龕 -龙 -龚 -龛 -龜 -龟 -︰ -︱ -︶ -︿ -﹁ -﹂ -﹍ -﹏ -﹐ -﹑ -﹒ -﹔ -﹕ -﹖ -﹗ -﹙ -﹚ -﹝ -﹞ -﹡ -﹣ -! -" -# -$ -% -& -' -( -) -* -+ -, -- -. -/ -0 -1 -2 -3 -4 -5 -6 -7 -8 -9 -: -; -< -= -> -? -@ -[ -\ -] -^ -_ -` -a -b -c -d -e -f -g -h -i -j -k -l -m -n -o -p -q -r -s -t -u -v -w -x -y -z -{ -| -} -~ -。 -「 -」 -、 -・ -ッ -ー -イ -ク -シ -ス -ト -ノ -フ -ラ -ル -ン -゙ -゚ - ̄ -¥ -👍 -🔥 -😂 -😎 -... 
-yam -10 -2017 -12 -11 -2016 -20 -30 -15 -06 -lofter -##s -2015 -by -16 -14 -18 -13 -24 -17 -2014 -21 -##0 -22 -19 -25 -23 -com -100 -00 -05 -2013 -##a -03 -09 -08 -28 -##2 -50 -01 -04 -##1 -27 -02 -2012 -##3 -26 -##e -07 -##8 -##5 -##6 -##4 -##9 -##7 -29 -2011 -40 -##t -2010 -##o -##d -##i -2009 -##n -app -www -the -##m -31 -##c -##l -##y -##r -##g -2008 -60 -http -200 -qq -##p -80 -##f -google -pixnet -90 -cookies -tripadvisor -500 -##er -##k -35 -##h -facebook -2007 -2000 -70 -##b -of -##x -##u -45 -300 -iphone -32 -1000 -2006 -48 -ip -36 -in -38 -3d -##w -##ing -55 -ctrip -##on -##v -33 -##の -to -34 -400 -id -2005 -it -37 -windows -llc -top -99 -42 -39 -000 -led -at -##an -41 -51 -52 -46 -49 -43 -53 -44 -##z -android -58 -and -59 -2004 -56 -vr -##か -5000 -2003 -47 -blogthis -twitter -54 -##le -150 -ok -2018 -57 -75 -cn -no -ios -##in -##mm -##00 -800 -on -te -3000 -65 -2001 -360 -95 -ig -lv -120 -##ng -##を -##us -##に -pc -てす -── -600 -##te -85 -2002 -88 -##ed -html -ncc -wifi -email -64 -blog -is -##10 -##て -mail -online -##al -dvd -##ic -studio -##は -##℃ -##ia -##と -line -vip -72 -##q -98 -##ce -##en -for -##is -##ra -##es -##j -usb -net -cp -1999 -asia -4g -##cm -diy -new -3c -##お -ta -66 -language -vs -apple -tw -86 -web -##ne -ipad -62 -you -##re -101 -68 -##tion -ps -de -bt -pony -atm -##2017 -1998 -67 -##ch -ceo -##or -go -##na -av -pro -cafe -96 -pinterest -97 -63 -pixstyleme3c -##ta -more -said -##2016 -1997 -mp3 -700 -##ll -nba -jun -##20 -92 -tv -1995 -pm -61 -76 -nbsp -250 -##ie -linux -##ma -cd -110 -hd -##17 -78 -##ion -77 -6000 -am -##th -##st -94 -##se -##et -69 -180 -gdp -my -105 -81 -abc -89 -flash -79 -one -93 -1990 -1996 -##ck -gps -##も -##ly -web885 -106 -2020 -91 -##ge -4000 -1500 -xd -boss -isbn -1994 -org -##ry -me -love -##11 -0fork -73 -##12 -3g -##ter -##ar -71 -82 -##la -hotel -130 -1970 -pk -83 -87 -140 -ie -##os -##30 -##el -74 -##50 -seo -cpu -##ml -p2p -84 -may -##る -sun -tue -internet -cc -posted -youtube -##at -##ン -##man -ii 
-##ル -##15 -abs -nt -pdf -yahoo -ago -1980 -##it -news -mac -104 -##てす -##me -##り -java -1992 -spa -##de -##nt -hk -all -plus -la -1993 -##mb -##16 -##ve -west -##da -160 -air -##い -##ps -から -##to -1989 -logo -htc -php -https -fi -momo -##son -sat -##ke -##80 -ebd -suv -wi -day -apk -##88 -##um -mv -galaxy -wiki -or -brake -##ス -1200 -する -this -1991 -mon -##こ -❤2017 -po -##ない -javascript -life -home -june -##ss -system -900 -##ー -##0 -pp -1988 -world -fb -4k -br -##as -ic -ai -leonardo -safari -##60 -live -free -xx -wed -win7 -kiehl -##co -lg -o2o -##go -us -235 -1949 -mm -しい -vfm -kanye -##90 -##2015 -##id -jr -##ey -123 -rss -##sa -##ro -##am -##no -thu -fri -350 -##sh -##ki -103 -comments -name -##のて -##pe -##ine -max -1987 -8000 -uber -##mi -##ton -wordpress -office -1986 -1985 -##ment -107 -bd -win10 -##ld -##li -gmail -bb -dior -##rs -##ri -##rd -##ます -up -cad -##® -dr -して -read -##21 -をお -##io -##99 -url -1984 -pvc -paypal -show -policy -##40 -##ty -##18 -with -##★ -##01 -txt -102 -##ba -dna -from -post -mini -ar -taiwan -john -##ga -privacy -agoda -##13 -##ny -word -##24 -##22 -##by -##ur -##hz -1982 -##ang -265 -cookie -netscape -108 -##ka -##~ -##ad -house -share -note -ibm -code -hello -nike -sim -survey -##016 -1979 -1950 -wikia -##32 -##017 -5g -cbc -##tor -##kg -1983 -##rt -##14 -campaign -store -2500 -os -##ct -##ts -##° -170 -api -##ns -365 -excel -##な -##ao -##ら -##し -~~ -##nd -university -163 -には -518 -##70 -##ya -##il -##25 -pierre -ipo -0020 -897 -##23 -hotels -##ian -のお -125 -years -6606 -##ers -##26 -high -##day -time -##ay -bug -##line -##く -##す -##be -xp -talk2yam -yamservice -10000 -coco -##dy -sony -##ies -1978 -microsoft -david -people -##ha -1960 -instagram -intel -その -##ot -iso -1981 -##va -115 -##mo -##land -xxx -man -co -ltxsw -##ation -baby -220 -##pa -##ol -1945 -7000 -tag -450 -##ue -msn -##31 -oppo -##ト -##ca -control -##om -st -chrome -##ure -##ん -be -##き -lol -##19 -した -##bo -240 -lady -##100 -##way -##から -4600 -##ko -##do -##un 
-4s -corporation -168 -##ni -herme -##28 -cp -978 -##up -##06 -ui -##ds -ppt -admin -three -します -bbc -re -128 -##48 -ca -##015 -##35 -hp -##ee -tpp -##た -##ive -×× -root -##cc -##ました -##ble -##ity -adobe -park -114 -et -oled -city -##ex -##ler -##ap -china -##book -20000 -view -##ice -global -##km -your -hong -##mg -out -##ms -ng -ebay -##29 -menu -ubuntu -##cy -rom -##view -open -ktv -do -server -##lo -if -english -##ね -##5 -##oo -1600 -##02 -step1 -kong -club -135 -july -inc -1976 -mr -hi -##net -touch -##ls -##ii -michael -lcd -##05 -##33 -phone -james -step2 -1300 -ios9 -##box -dc -##2 -##ley -samsung -111 -280 -pokemon -css -##ent -##les -いいえ -##1 -s8 -atom -play -bmw -##said -sa -etf -ctrl -♥yoyo♥ -##55 -2025 -##2014 -##66 -adidas -amazon -1958 -##ber -##ner -visa -##77 -##der -1800 -connectivity -##hi -firefox -109 -118 -hr -so -style -mark -pop -ol -skip -1975 -as -##27 -##ir -##61 -190 -mba -##う -##ai -le -##ver -1900 -cafe2017 -lte -super -113 -129 -##ron -amd -like -##☆ -are -##ster -we -##sk -paul -data -international -##ft -longchamp -ssd -good -##ート -##ti -reply -##my -↓↓↓ -apr -star -##ker -source -136 -js -112 -get -force -photo -##one -126 -##2013 -##ow -link -bbs -1972 -goods -##lin -python -119 -##ip -game -##ics -##ません -blue -##● -520 -##45 -page -itunes -##03 -1955 -260 -1968 -gt -gif -618 -##ff -##47 -group -くたさい -about -bar -ganji -##nce -music -lee -not -1977 -1971 -1973 -##per -an -faq -comment -##って -days -##ock -116 -##bs -1974 -1969 -v1 -player -1956 -xbox -sql -fm -f1 -139 -##ah -210 -##lv -##mp -##000 -melody -1957 -##3 -550 -17life -199 -1966 -xml -market -##au -##71 -999 -##04 -what -gl -##95 -##age -tips -##68 -book -##ting -mysql -can -1959 -230 -##ung -wonderland -watch -10℃ -##ction -9000 -mar -mobile -1946 -1962 -article -##db -part -▲top -party -って -1967 -1964 -1948 -##07 -##ore -##op -この -dj -##78 -##38 -010 -main -225 -1965 -##ong -art -320 -ad -134 -020 -##73 -117 -pm2 -japan -228 -##08 -ts -1963 -##ica -der -sm -##36 -2019 
-##wa -ct -##7 -##や -##64 -1937 -homemesh -search -##85 -##れは -##tv -##di -macbook -##9 -##くたさい -service -##♥ -type -った -750 -##ier -##si -##75 -##います -##ok -best -##ット -goris -lock -##った -cf -3m -big -##ut -ftp -carol -##vi -10 -1961 -happy -sd -##ac -122 -anti -pe -cnn -iii -1920 -138 -##ラ -1940 -esp -jan -tags -##98 -##51 -august -vol -##86 -154 -##™ -##fs -##れ -##sion -design -ac -##ム -press -jordan -ppp -that -key -check -##6 -##tt -##㎡ -1080p -##lt -power -##42 -1952 -##bc -vivi -##ック -he -133 -121 -jpg -##rry -201 -175 -3500 -1947 -nb -##ted -##rn -しています -1954 -usd -##t00 -master -##ンク -001 -model -##58 -al -##09 -1953 -##34 -ram -goo -ても -##ui -127 -1930 -red -##ary -rpg -item -##pm -##41 -270 -##za -project -##2012 -hot -td -blogabstract -##ger -##62 -650 -##44 -gr2 -##します -##m -black -electronic -nfc -year -asus -また -html5 -cindy -##hd -m3 -132 -esc -##od -booking -##53 -fed -tvb -##81 -##ina -mit -165 -##いる -chan -192 -distribution -next -になる -peter -bios -steam -cm -1941 -にも -pk10 -##ix -##65 -##91 -dec -nasa -##ana -icecat -00z -b1 -will -##46 -li -se -##ji -##み -##ard -oct -##ain -jp -##ze -##bi -cio -##56 -smart -h5 -##39 -##port -curve -vpn -##nm -##dia -utc -##あり -12345678910 -##52 -rmvb -chanel -a4 -miss -##and -##im -media -who -##63 -she -girl -5s -124 -vera -##して -class -vivo -king -##フ -##ei -national -ab -1951 -5cm -888 -145 -ipod -ap -1100 -5mm -211 -ms -2756 -##69 -mp4 -msci -##po -##89 -131 -mg -index -380 -##bit -##out -##zz -##97 -##67 -158 -apec -##8 -photoshop -opec -¥799 -ては -##96 -##tes -##ast -2g -○○ -##ール -¥2899 -##ling -##よ -##ory -1938 -##ical -kitty -content -##43 -step3 -##cn -win8 -155 -vc -1400 -iphone7 -robert -##した -tcl -137 -beauty -##87 -en -dollars -##ys -##oc -step -pay -yy -a1 -##2011 -##lly -##ks -##♪ -1939 -188 -download -1944 -sep -exe -ph -います -school -gb -center -pr -street -##board -uv -##37 -##lan -winrar -##que -##ua -##com -1942 -1936 -480 -gpu -##4 -ettoday -fu -tom -##54 -##ren -##via -149 -##72 -b2b -144 
-##79 -##tch -rose -arm -mb -##49 -##ial -##nn -nvidia -step4 -mvp -00㎡ -york -156 -##イ -how -cpi -591 -2765 -gov -kg -joe -##xx -mandy -pa -##ser -copyright -fashion -1935 -don -##け -ecu -##ist -##art -erp -wap -have -##lm -talk -##ek -##ning -##if -ch -##ite -video -1943 -cs -san -iot -look -##84 -##2010 -##ku -october -##ux -trump -##hs -##ide -box -141 -first -##ins -april -##ight -##83 -185 -angel -protected -aa -151 -162 -x1 -m2 -##fe -##× -##ho -size -143 -min -ofo -fun -gomaji -ex -hdmi -food -dns -march -chris -kevin -##のか -##lla -##pp -##ec -ag -ems -6s -720p -##rm -##ham -off -##92 -asp -team -fandom -ed -299 -▌♥ -##ell -info -されています -##82 -sina -4066 -161 -##able -##ctor -330 -399 -315 -dll -rights -ltd -idc -jul -3kg -1927 -142 -ma -surface -##76 -##ク -~~~ -304 -mall -eps -146 -green -##59 -map -space -donald -v2 -sodu -##light -1931 -148 -1700 -まて -310 -reserved -htm -##han -##57 -2d -178 -mod -##ise -##tions -152 -ti -##shi -doc -1933 -icp -055 -wang -##ram -shopping -aug -##pi -##well -now -wam -b2 -からお -##hu -236 -1928 -##gb -266 -f2 -##93 -153 -mix -##ef -##uan -bwl -##plus -##res -core -##ess -tea -5℃ -hktvmall -nhk -##ate -list -##ese -301 -feb -4m -inn -ての -nov -159 -12345 -daniel -##ci -pass -##bet -##nk -coffee -202 -ssl -airbnb -##ute -fbi -woshipm -skype -ea -cg -sp -##fc -##www -yes -edge -alt -007 -##94 -fpga -##ght -##gs -iso9001 -さい -##ile -##wood -##uo -image -lin -icon -american -##em -1932 -set -says -##king -##tive -blogger -##74 -なと -256 -147 -##ox -##zy -##red -##ium -##lf -nokia -claire -##リ -##ding -november -lohas -##500 -##tic -##マ -##cs -##ある -##che -##ire -##gy -##ult -db -january -win -##カ -166 -road -ptt -##ま -##つ -198 -##fa -##mer -anna -pchome -はい -udn -ef -420 -##time -##tte -2030 -##ア -g20 -white -かかります -1929 -308 -garden -eleven -di -##おります -chen -309b -777 -172 -young -cosplay -ちてない -4500 -bat -##123 -##tra -##ては -kindle -npc -steve -etc -##ern -##| -call -xperia -ces -travel -sk -s7 -##ous -1934 -##int -みいたたけます -183 
-edu -file -cho -qr -##car -##our -186 -##ant -##d -eric -1914 -rends -##jo -##する -mastercard -##2000 -kb -##min -290 -##ino -vista -##ris -##ud -jack -2400 -##set -169 -pos -1912 -##her -##ou -taipei -しく -205 -beta -##ませんか -232 -##fi -express -255 -body -##ill -aphojoy -user -december -meiki -##ick -tweet -richard -##av -##ᆫ -iphone6 -##dd -ちてすか -views -##mark -321 -pd -##00 -times -##▲ -level -##ash -10g -point -5l -##ome -208 -koreanmall -##ak -george -q2 -206 -wma -tcp -##200 -スタッフ -full -mlb -##lle -##watch -tm -run -179 -911 -smith -business -##und -1919 -color -##tal -222 -171 -##less -moon -4399 -##rl -update -pcb -shop -499 -157 -little -なし -end -##mhz -van -dsp -easy -660 -##house -##key -history -##o -oh -##001 -##hy -##web -oem -let -was -##2009 -##gg -review -##wan -182 -##°c -203 -uc -title -##val -united -233 -2021 -##ons -doi -trivago -overdope -sbs -##ance -##ち -grand -special -573032185 -imf -216 -wx17house -##so -##ーム -audi -##he -london -william -##rp -##ake -science -beach -cfa -amp -ps4 -880 -##800 -##link -##hp -crm -ferragamo -bell -make -##eng -195 -under -zh -photos -2300 -##style -##ント -via -176 -da -##gi -company -i7 -##ray -thomas -370 -ufo -i5 -##max -plc -ben -back -research -8g -173 -mike -##pc -##ッフ -september -189 -##ace -vps -february -167 -pantos -wp -lisa -1921 -★★ -jquery -night -long -offer -##berg -##news -1911 -##いて -ray -fks -wto -せます -over -164 -340 -##all -##rus -1924 -##888 -##works -blogtitle -loftpermalink -##→ -187 -martin -test -ling -km -##め -15000 -fda -v3 -##ja -##ロ -wedding -かある -outlet -family -##ea -をこ -##top -story -##ness -salvatore -##lu -204 -swift -215 -room -している -oracle -##ul -1925 -sam -b2c -week -pi -rock -##のは -##a -##けと -##ean -##300 -##gle -cctv -after -chinese -##back -powered -x2 -##tan -1918 -##nes -##イン -canon -only -181 -##zi -##las -say -##oe -184 -##sd -221 -##bot -##world -##zo -sky -made -top100 -just -1926 -pmi -802 -234 -gap -##vr -177 -les -174 -▲topoct -ball -vogue -vi -ing -ofweek -cos 
-##list -##ort -▲topmay -##なら -##lon -として -last -##tc -##of -##bus -##gen -real -eva -##コ -a3 -nas -##lie -##ria -##coin -##bt -▲topapr -his -212 -cat -nata -vive -health -⋯⋯ -drive -sir -▲topmar -du -cup -##カー -##ook -##よう -##sy -alex -msg -tour -しました -3ce -##word -193 -ebooks -r8 -block -318 -##より -2200 -nice -pvp -207 -months -1905 -rewards -##ther -1917 -0800 -##xi -##チ -##sc -micro -850 -gg -blogfp -op -1922 -daily -m1 -264 -true -##bb -ml -##tar -##のお -##ky -anthony -196 -253 -##yo -state -218 -##ara -##aa -##rc -##tz -##ston -より -gear -##eo -##ade -ge -see -1923 -##win -##ura -ss -heart -##den -##ita -down -##sm -el -png -2100 -610 -rakuten -whatsapp -bay -dream -add -##use -680 -311 -pad -gucci -mpv -##ode -##fo -island -▲topjun -##▼ -223 -jason -214 -chicago -##❤ -しの -##hone -io -##れる -##ことか -sogo -be2 -##ology -990 -cloud -vcd -##con -2~3 -##ford -##joy -##kb -##こさいます -##rade -but -##ach -docker -##ful -rfid -ul -##ase -hit -ford -##star -580 -##○ -11 -a2 -sdk -reading -edited -##are -cmos -##mc -238 -siri -light -##ella -##ため -bloomberg -##read -pizza -##ison -jimmy -##vm -college -node -journal -ba -18k -##play -245 -##cer -20 -magic -##yu -191 -jump -288 -tt -##ings -asr -##lia -3200 -step5 -network -##cd -mc -いします -1234 -pixstyleme -273 -##600 -2800 -money -★★★★★ -1280 -12 -430 -bl -みの -act -##tus -tokyo -##rial -##life -emba -##ae -saas -tcs -##rk -##wang -summer -##sp -ko -##ving -390 -premium -##その -netflix -##ヒ -uk -mt -##lton -right -frank -two -209 -える -##ple -##cal -021 -##んな -##sen -##ville -hold -nexus -dd -##ius -てお -##mah -##なく -tila -zero -820 -ce -##tin -resort -##ws -charles -old -p10 -5d -report -##360 -##ru -##には -bus -vans -lt -##est -pv -##レ -links -rebecca -##ツ -##dm -azure -##365 -きな -limited -bit -4gb -##mon -1910 -moto -##eam -213 -1913 -var -eos -なとの -226 -blogspot -された -699 -e3 -dos -dm -fc -##ments -##ik -##kw -boy -##bin -##ata -960 -er -##せ -219 -##vin -##tu -##ula -194 -##∥ -station -##ろ -##ature -835 -files -zara -hdr 
-top10 -nature -950 -magazine -s6 -marriott -##シ -avira -case -##っと -tab -##ran -tony -##home -oculus -im -##ral -jean -saint -cry -307 -rosie -##force -##ini -ice -##bert -のある -##nder -##mber -pet -2600 -##◆ -plurk -▲topdec -##sis -00kg -▲topnov -720 -##ence -tim -##ω -##nc -##ても -##name -log -ips -great -ikea -malaysia -unix -##イト -3600 -##ncy -##nie -12000 -akb48 -##ye -##oid -404 -##chi -##いた -oa -xuehai -##1000 -##orm -##rf -275 -さん -##ware -##リー -980 -ho -##pro -text -##era -560 -bob -227 -##ub -##2008 -8891 -scp -avi -##zen -2022 -mi -wu -museum -qvod -apache -lake -jcb -▲topaug -★★★ -ni -##hr -hill -302 -ne -weibo -490 -ruby -##ーシ -##ヶ -##row -4d -▲topjul -iv -##ish -github -306 -mate -312 -##スト -##lot -##ane -andrew -のハイト -##tina -t1 -rf -ed2k -##vel -##900 -way -final -りの -ns -5a -705 -197 -##メ -sweet -bytes -##ene -▲topjan -231 -##cker -##2007 -##px -100g -topapp -229 -helpapp -rs -low -14k -g4g -care -630 -ldquo -あり -##fork -leave -rm -edition -##gan -##zon -##qq -▲topsep -##google -##ism -gold -224 -explorer -##zer -toyota -category -select -visual -##labels -restaurant -##md -posts -s1 -##ico -もっと -angelababy -123456 -217 -sports -s3 -mbc -1915 -してくたさい -shell -x86 -candy -##new -kbs -face -xl -470 -##here -4a -swissinfo -v8 -▲topfeb -dram -##ual -##vice -3a -##wer -sport -q1 -ios10 -public -int -card -##c -ep -au -rt -##れた -1080 -bill -##mll -kim -30 -460 -wan -##uk -##ミ -x3 -298 -0t -scott -##ming -239 -e5 -##3d -h7n9 -worldcat -brown -##あります -##vo -##led -##580 -##ax -249 -410 -##ert -paris -##~6 -polo -925 -##lr -599 -##ナ -capital -##hing -bank -cv -1g -##chat -##s -##たい -adc -##ule -2m -##e -digital -hotmail -268 -##pad -870 -bbq -quot -##ring -before -wali -##まて -mcu -2k -2b -という -costco -316 -north -333 -switch -##city -##p -philips -##mann -management -panasonic -##cl -##vd -##ping -##rge -alice -##lk -##ましょう -css3 -##ney -vision -alpha -##ular -##400 -##tter -lz -にお -##ありません -mode -gre -1916 -pci -##tm -237 -1~2 -##yan -##そ -について -##let -##キ 
-work -war -coach -ah -mary -##ᅵ -huang -##pt -a8 -pt -follow -##berry -1895 -##ew -a5 -ghost -##ション -##wn -##og -south -##code -girls -##rid -action -villa -git -r11 -table -games -##cket -error -##anonymoussaid -##ag -here -##ame -##gc -qa -##■ -##lis -gmp -##gin -vmalife -##cher -yu -wedding -##tis -demo -dragon -530 -soho -social -bye -##rant -river -orz -acer -325 -##↑ -##ース -##ats -261 -del -##ven -440 -ups -##ように -##ター -305 -value -macd -yougou -##dn -661 -##ano -ll -##urt -##rent -continue -script -##wen -##ect -paper -263 -319 -shift -##chel -##フト -##cat -258 -x5 -fox -243 -##さん -car -aaa -##blog -loading -##yn -##tp -kuso -799 -si -sns -イカせるテンマ -ヒンクテンマ3 -rmb -vdc -forest -central -prime -help -ultra -##rmb -##ような -241 -square -688 -##しい -のないフロクに -##field -##reen -##ors -##ju -c1 -start -510 -##air -##map -cdn -##wo -cba -stephen -m8 -100km -##get -opera -##base -##ood -vsa -com™ -##aw -##ail -251 -なのて -count -t2 -##ᅡ -##een -2700 -hop -##gp -vsc -tree -##eg -##ose -816 -285 -##ories -##shop -alphago -v4 -1909 -simon -##ᆼ -fluke62max -zip -スホンサー -##sta -louis -cr -bas -##~10 -bc -##yer -hadoop -##ube -##wi -1906 -0755 -hola -##low -place -centre -5v -d3 -##fer -252 -##750 -##media -281 -540 -0l -exchange -262 -series -##ハー -##san -eb -##bank -##k -q3 -##nge -##mail -take -##lp -259 -1888 -client -east -cache -event -vincent -##ールを -きを -##nse -sui -855 -adchoice -##и -##stry -##なたの -246 -##zone -ga -apps -sea -##ab -248 -cisco -##タ -##rner -kymco -##care -dha -##pu -##yi -minkoff -royal -p1 -への -annie -269 -collection -kpi -playstation -257 -になります -866 -bh -##bar -queen -505 -radio -1904 -andy -armani -##xy -manager -iherb -##ery -##share -spring -raid -johnson -1908 -##ob -volvo -hall -##ball -v6 -our -taylor -##hk -bi -242 -##cp -kate -bo -water -technology -##rie -サイトは -277 -##ona -##sl -hpv -303 -gtx -hip -rdquo -jayz -stone -##lex -##rum -namespace -##やり -620 -##ale -##atic -des -##erson -##ql -##ves -##type -enter -##この -##てきます -d2 -##168 -##mix 
-##bian -との -a9 -jj -ky -##lc -access -movie -##hc -リストに -tower -##ration -##mit -ます -##nch -ua -tel -prefix -##o2 -1907 -##point -1901 -ott -~10 -##http -##ury -baidu -##ink -member -##logy -bigbang -nownews -##js -##shot -##tb -##こと -247 -eba -##tics -##lus -ける -v5 -spark -##ama -there -##ions -god -##lls -##down -hiv -##ress -burberry -day2 -##kv -◆◆ -jeff -related -film -edit -joseph -283 -##ark -cx -32gb -order -g9 -30000 -##ans -##tty -s5 -##bee -かあります -thread -xr -buy -sh -005 -land -spotify -mx -##ari -276 -##verse -×email -sf -why -##ことて -244 -7headlines -nego -sunny -dom -exo -401 -666 -positioning -fit -rgb -##tton -278 -kiss -alexa -adam -lp -みリストを -##g -mp -##ties -##llow -amy -##du -np -002 -institute -271 -##rth -##lar -2345 -590 -##des -sidebar -15 -imax -site -##cky -##kit -##ime -##009 -season -323 -##fun -##ンター -##ひ -gogoro -a7 -pu -lily -fire -twd600 -##ッセーシを -いて -##vis -30ml -##cture -##をお -information -##オ -close -friday -##くれる -yi -nick -てすか -##tta -##tel -6500 -##lock -cbd -economy -254 -かお -267 -tinker -double -375 -8gb -voice -##app -oops -channel -today -985 -##right -raw -xyz -##+ -jim -edm -##cent -7500 -supreme -814 -ds -##its -##asia -dropbox -##てすか -##tti -books -272 -100ml -##tle -##ller -##ken -##more -##boy -sex -309 -##dom -t3 -##ider -##なります -##unch -1903 -810 -feel -5500 -##かった -##put -により -s2 -mo -##gh -men -ka -amoled -div -##tr -##n1 -port -howard -##tags -ken -dnf -##nus -adsense -##а -ide -##へ -buff -thunder -##town -##ique -has -##body -auto -pin -##erry -tee -てした -295 -number -##the -##013 -object -psp -cool -udnbkk -16gb -##mic -miui -##tro -most -r2 -##alk -##nity -1880 -±0 -##いました -428 -s4 -law -version -##oa -n1 -sgs -docomo -##tf -##ack -henry -fc2 -##ded -##sco -##014 -##rite -286 -0mm -linkedin -##ada -##now -wii -##ndy -ucbug -##◎ -sputniknews -legalminer -##ika -##xp -2gb -##bu -q10 -oo -b6 -come -##rman -cheese -ming -maker -##gm -nikon -##fig -ppi -kelly -##ります -jchere -てきます -ted -md -003 -fgo -tech -##tto 
-dan -soc -##gl -##len -hair -earth -640 -521 -img -##pper -##a1 -##てきる -##ロク -acca -##ition -##ference -suite -##ig -outlook -##mond -##cation -398 -##pr -279 -101vip -358 -##999 -282 -64gb -3800 -345 -airport -##over -284 -##おり -jones -##ith -lab -##su -##いるのて -co2 -town -piece -##llo -no1 -vmware -24h -##qi -focus -reader -##admin -##ora -tb -false -##log -1898 -know -lan -838 -##ces -f4 -##ume -motel -stop -##oper -na -flickr -netcomponents -##af -##─ -pose -williams -local -##ound -##cg -##site -##iko -いお -274 -5m -gsm -con -##ath -1902 -friends -##hip -cell -317 -##rey -780 -cream -##cks -012 -##dp -facebooktwitterpinterestgoogle -sso -324 -shtml -song -swiss -##mw -##キンク -lumia -xdd -string -tiffany -522 -marc -られた -insee -russell -sc -dell -##ations -ok -camera -289 -##vs -##flow -##late -classic -287 -##nter -stay -g1 -mtv -512 -##ever -##lab -##nger -qe -sata -ryan -d1 -50ml -cms -##cing -su -292 -3300 -editor -296 -##nap -security -sunday -association -##ens -##700 -##bra -acg -##かり -sofascore -とは -mkv -##ign -jonathan -gary -build -labels -##oto -tesla -moba -qi -gohappy -general -ajax -1024 -##かる -サイト -society -##test -##urs -wps -fedora -##ich -mozilla -328 -##480 -##dr -usa -urn -##lina -##r -grace -##die -##try -##ader -1250 -##なり -elle -570 -##chen -##ᆯ -price -##ten -uhz -##ough -eq -##hen -states -push -session -balance -wow -506 -##cus -##py -when -##ward -##ep -34e -wong -library -prada -##サイト -##cle -running -##ree -313 -ck -date -q4 -##ctive -##ool -##> -mk -##ira -##163 -388 -die -secret -rq -dota -buffet -は1ヶ -e6 -##ez -pan -368 -ha -##card -##cha -2a -##さ -alan -day3 -eye -f3 -##end -france -keep -adi -rna -tvbs -##ala -solo -nova -##え -##tail -##ょう -support -##ries -##なる -##ved -base -copy -iis -fps -##ways -hero -hgih -profile -fish -mu -ssh -entertainment -chang -##wd -click -cake -##ond -pre -##tom -kic -pixel -##ov -##fl -product -6a -##pd -dear -##gate -es -yumi -audio -##² -##sky -echo -bin -where -##ture -329 -##ape -find -sap 
-isis -##なと -nand -##101 -##load -##ream -band -a6 -525 -never -##post -festival -50cm -##we -555 -guide -314 -zenfone -##ike -335 -gd -forum -jessica -strong -alexander -##ould -software -allen -##ious -program -360° -else -lohasthree -##gar -することかてきます -please -##れます -rc -##ggle -##ric -bim -50000 -##own -eclipse -355 -brian -3ds -##side -061 -361 -##other -##ける -##tech -##ator -485 -engine -##ged -##t -plaza -##fit -cia -ngo -westbrook -shi -tbs -50mm -##みませんか -sci -291 -reuters -##ily -contextlink -##hn -af -##cil -bridge -very -##cel -1890 -cambridge -##ize -15g -##aid -##data -790 -frm -##head -award -butler -##sun -meta -##mar -america -ps3 -puma -pmid -##すか -lc -670 -kitchen -##lic -オーフン5 -きなしソフトサーヒス -そして -day1 -future -★★★★ -##text -##page -##rris -pm1 -##ket -fans -##っています -1001 -christian -bot -kids -trackback -##hai -c3 -display -##hl -n2 -1896 -idea -さんも -##sent -airmail -##ug -##men -pwm -けます -028 -##lution -369 -852 -awards -schemas -354 -asics -wikipedia -font -##tional -##vy -c2 -293 -##れている -##dget -##ein -っている -contact -pepper -スキル -339 -##~5 -294 -##uel -##ument -730 -##hang -みてす -q5 -##sue -rain -##ndi -wei -swatch -##cept -わせ -331 -popular -##ste -##tag -p2 -501 -trc -1899 -##west -##live -justin -honda -ping -messenger -##rap -v9 -543 -##とは -unity -appqq -はすへて -025 -leo -##tone -##テ -##ass -uniqlo -##010 -502 -her -jane -memory -moneydj -##tical -human -12306 -していると -##m2 -coc -miacare -##mn -tmt -##core -vim -kk -##may -fan -target -use -too -338 -435 -2050 -867 -737 -fast -##2c -services -##ope -omega -energy -##わ -pinkoi -1a -##なから -##rain -jackson -##ement -##シャンルの -374 -366 -そんな -p9 -rd -##ᆨ -1111 -##tier -##vic -zone -##│ -385 -690 -dl -isofix -cpa -m4 -322 -kimi -めて -davis -##lay -lulu -##uck -050 -weeks -qs -##hop -920 -##n -ae -##ear -~5 -eia -405 -##fly -korea -jpeg -boost -##ship -small -##リア -1860 -eur -297 -425 -valley -##iel -simple -##ude -rn -k2 -##ena -されます -non -patrick -しているから -##ナー -feed -5757 -30g -process -well -qqmei 
-##thing -they -aws -lu -pink -##ters -##kin -または -board -##vertisement -wine -##ien -unicode -##dge -r1 -359 -##tant -いを -##twitter -##3c -cool1 -される -##れて -##l -isp -##012 -standard -45㎡2 -402 -##150 -matt -##fu -326 -##iner -googlemsn -pixnetfacebookyahoo -##ラン -x7 -886 -##uce -メーカー -sao -##ev -##きました -##file -9678 -403 -xddd -shirt -6l -##rio -##hat -3mm -givenchy -ya -bang -##lio -monday -crystal -ロクイン -##abc -336 -head -890 -ubuntuforumwikilinuxpastechat -##vc -##~20 -##rity -cnc -7866 -ipv6 -null -1897 -##ost -yang -imsean -tiger -##fet -##ンス -352 -##= -dji -327 -ji -maria -##come -##んて -foundation -3100 -##beth -##なった -1m -601 -active -##aft -##don -3p -sr -349 -emma -##khz -living -415 -353 -1889 -341 -709 -457 -sas -x6 -##face -pptv -x4 -##mate -han -sophie -##jing -337 -fifa -##mand -other -sale -inwedding -##gn -てきちゃいます -##mmy -##pmlast -bad -nana -nbc -してみてくたさいね -なとはお -##wu -##かあります -##あ -note7 -single -##340 -せからこ -してくたさい♪この -しにはとんとんワークケートを -するとあなたにもっとマッチした -ならワークケートへ -もみつかっちゃうかも -ワークケートの -##bel -window -##dio -##ht -union -age -382 -14 -##ivity -##y -コメント -domain -neo -##isa -##lter -5k -f5 -steven -##cts -powerpoint -tft -self -g2 -ft -##テル -zol -##act -mwc -381 -343 -もう -nbapop -408 -てある -eds -ace -##room -previous -author -tomtom -il -##ets -hu -financial -☆☆☆ -っています -bp -5t -chi -1gb -##hg -fairmont -cross -008 -gay -h2 -function -##けて -356 -also -1b -625 -##ータ -##raph -1894 -3~5 -##ils -i3 -334 -avenue -##host -による -##bon -##tsu -message -navigation -50g -fintech -h6 -##ことを -8cm -##ject -##vas -##firm -credit -##wf -xxxx -form -##nor -##space -huawei -plan -json -sbl -##dc -machine -921 -392 -wish -##120 -##sol -windows7 -edward -##ために -development -washington -##nsis -lo -818 -##sio -##ym -##bor -planet -##~8 -##wt -ieee -gpa -##めて -camp -ann -gm -##tw -##oka -connect -##rss -##work -##atus -wall -chicken -soul -2mm -##times -fa -##ather -##cord -009 -##eep -hitachi -gui -harry -##pan -e1 -disney -##press -##ーション -wind -386 -frigidaire -##tl 
-liu -hsu -332 -basic -von -ev -いた -てきる -スホンサーサイト -learning -##ull -expedia -archives -change -##wei -santa -cut -ins -6gb -turbo -brand -cf1 -508 -004 -return -747 -##rip -h1 -##nis -##をこ -128gb -##にお -3t -application -しており -emc -rx -##oon -384 -quick -412 -15058 -wilson -wing -chapter -##bug -beyond -##cms -##dar -##oh -zoom -e2 -trip -sb -##nba -rcep -342 -aspx -ci -080 -gc -gnu -める -##count -advanced -dance -dv -##url -##ging -367 -8591 -am09 -shadow -battle -346 -##i -##cia -##という -emily -##のてす -##tation -host -ff -techorz -sars -##mini -##mporary -##ering -nc -4200 -798 -##next -cma -##mbps -##gas -##ift -##dot -##ィ -455 -##~17 -amana -##りの -426 -##ros -ir -00㎡1 -##eet -##ible -##↓ -710 -ˋ▽ˊ -##aka -dcs -iq -##v -l1 -##lor -maggie -##011 -##iu -588 -##~1 -830 -##gt -1tb -articles -create -##burg -##iki -database -fantasy -##rex -##cam -dlc -dean -##you -hard -path -gaming -victoria -maps -cb -##lee -##itor -overchicstoretvhome -systems -##xt -416 -p3 -sarah -760 -##nan -407 -486 -x9 -install -second -626 -##ann -##ph -##rcle -##nic -860 -##nar -ec -##とう -768 -metro -chocolate -##rian -~4 -##table -##しています -skin -##sn -395 -mountain -##0mm -inparadise -6m -7x24 -ib -4800 -##jia -eeworld -creative -g5 -g3 -357 -parker -ecfa -village -からの -18000 -sylvia -サーヒス -hbl -##ques -##onsored -##x2 -##きます -##v4 -##tein -ie6 -383 -##stack -389 -ver -##ads -##baby -sound -bbe -##110 -##lone -##uid -ads -022 -gundam -351 -thinkpad -006 -scrum -match -##ave -mems -##470 -##oy -##なりました -##talk -glass -lamigo -span -##eme -job -##a5 -jay -wade -kde -498 -##lace -ocean -tvg -##covery -##r3 -##ners -##rea -junior -think -##aine -cover -##ision -##sia -↓↓ -##bow -msi -413 -458 -406 -##love -711 -801 -soft -z2 -##pl -456 -1840 -mobil -mind -##uy -427 -nginx -##oi -めた -##rr -6221 -##mple -##sson -##ーシてす -371 -##nts -91tv -comhd -crv3000 -##uard -1868 -397 -deep -lost -field -gallery -##bia -rate -spf -redis -traction -930 -icloud -011 -なら -fe -jose -372 -##tory -into -sohu -fx -899 
-379 -kicstart2 -##hia -すく -##~3 -##sit -ra -24 -##walk -##xure -500g -##pact -pacific -xa -natural -carlo -##250 -##walker -1850 -##can -cto -gigi -516 -##サー -pen -##hoo -ob -matlab -##b -##yy -13913459 -##iti -mango -##bbs -sense -c5 -oxford -##ニア -walker -jennifer -##ola -course -##bre -701 -##pus -##rder -lucky -075 -##ぁ -ivy -なお -##nia -sotheby -side -##ugh -joy -##orage -##ush -##bat -##dt -364 -r9 -##2d -##gio -511 -country -wear -##lax -##~7 -##moon -393 -seven -study -411 -348 -lonzo -8k -##ェ -evolution -##イフ -##kk -gs -kd -##レス -arduino -344 -b12 -##lux -arpg -##rdon -cook -##x5 -dark -five -##als -##ida -とても -sign -362 -##ちの -something -20mm -##nda -387 -##posted -fresh -tf -1870 -422 -cam -##mine -##skip -##form -##ssion -education -394 -##tee -dyson -stage -##jie -want -##night -epson -pack -あります -##ppy -テリヘル -##█ -wd -##eh -##rence -left -##lvin -golden -mhz -discovery -##trix -##n2 -loft -##uch -##dra -##sse -speed -~1 -1mdb -sorry -welcome -##urn -wave -gaga -##lmer -teddy -##160 -トラックハック -せよ -611 -##f2016 -378 -rp -##sha -rar -##あなたに -##きた -840 -holiday -##ュー -373 -074 -##vg -##nos -##rail -gartner -gi -6p -##dium -kit -488 -b3 -eco -##ろう -20g -sean -##stone -autocad -nu -##np -f16 -write -029 -m5 -##ias -images -atp -##dk -fsm -504 -1350 -ve -52kb -##xxx -##のに -##cake -414 -unit -lim -ru -1v -##ification -published -angela -16g -analytics -ak -##q -##nel -gmt -##icon -again -##₂ -##bby -ios11 -445 -かこさいます -waze -いてす -##ハ -9985 -##ust -##ティー -framework -##007 -iptv -delete -52sykb -cl -wwdc -027 -30cm -##fw -##ての -1389 -##xon -brandt -##ses -##dragon -tc -vetements -anne -monte -modern -official -##へて -##ere -##nne -##oud -もちろん -50 -etnews -##a2 -##graphy -421 -863 -##ちゃん -444 -##rtex -##てお -l2 -##gma -mount -ccd -たと -archive -morning -tan -ddos -e7 -##ホ -day4 -##ウ -gis -453 -its -495 -factory -bruce -pg -##ito -ってくたさい -guest -cdma -##lling -536 -n3 -しかし -3~4 -mega -eyes -ro -13 -women -dac -church -##jun -singapore -##facebook -6991 -starbucks 
-##tos -##stin -##shine -zen -##mu -tina -20℃ -1893 -##たけて -503 -465 -request -##gence -qt -##っ -1886 -347 -363 -q7 -##zzi -diary -##tore -409 -##ead -468 -cst -##osa -canada -agent -va -##jiang -##ちは -##ーク -##lam -sg -##nix -##sday -##よって -g6 -##master -bing -##zl -charlie -16 -8mm -nb40 -##ーン -thai -##ルフ -ln284ct -##itz -##2f -bonnie -##food -##lent -originals -##stro -##lts -418 -∟∣ -##bscribe -children -ntd -yesstyle -##かも -hmv -##tment -d5 -2cm -arts -sms -##pn -##я -##いい -topios9 -539 -lifestyle -virtual -##ague -xz -##deo -muji -024 -unt -##nnis -##ᅩ -faq1 -1884 -396 -##ette -fly -64㎡ -はしめまして -441 -curry -##pop -のこ -release -##← -##◆◆ -##cast -073 -ありな -500ml -##ews -5c -##stle -ios7 -##ima -787 -dog -lenovo -##r4 -roger -013 -cbs -vornado -100m -417 -##desk -##クok -##ald -1867 -9595 -2900 -##van -oil -##x -some -break -common -##jy -##lines -g7 -twice -419 -ella -nano -belle -にこ -##mes -##self -##note -jb -##ことかてきます -benz -##との -##ova -451 -save -##wing -##ますのて -kai -りは -##hua -##rect -rainer -##unge -448 -##0m -adsl -##かな -guestname -##uma -##kins -##zu -tokichoi -##price -county -##med -##mus -rmk -391 -address -vm -えて -openload -##group -##hin -##iginal -amg -urban -##oz -jobs -emi -##public -beautiful -##sch -album -##dden -##bell -jerry -works -hostel -miller -##drive -##rmin -##10 -376 -boot -828 -##370 -##fx -##cm~ -1885 -##nome -##ctionary -##oman -##lish -##cr -##hm -433 -##how -432 -francis -xi -c919 -b5 -evernote -##uc -vga -##3000 -coupe -##urg -##cca -##uality -019 -6g -れる -multi -##また -##ett -em -hey -##ani -##tax -##rma -inside -than -740 -leonnhurt -##jin -ict -れた -bird -notes -200mm -くの -##dical -##lli -result -442 -iu -ee -438 -smap -gopro -##last -yin -pure -998 -32g -けた -5kg -##dan -##rame -mama -##oot -bean -marketing -##hur -2l -bella -sync -xuite -##ground -515 -discuz -##getrelax -##ince -##bay -##5s -cj -##イス -gmat -apt -##pass -jing -##rix -c4 -rich -##とても -niusnews -##ello -bag -770 -##eting -##mobile -18 -culture -015 -##のてすか 
-377 -1020 -area -##ience -616 -details -gp -universal -silver -dit -はお -private -ddd -u11 -kanshu -##ified -fung -##nny -dx -##520 -tai -475 -023 -##fr -##lean -3s -##pin -429 -##rin -25000 -ly -rick -##bility -usb3 -banner -##baru -##gion -metal -dt -vdf -1871 -karl -qualcomm -bear -1010 -oldid -ian -jo -##tors -population -##ernel -1882 -mmorpg -##mv -##bike -603 -##© -ww -friend -##ager -exhibition -##del -##pods -fpx -structure -##free -##tings -kl -##rley -##copyright -##mma -california -3400 -orange -yoga -4l -canmake -honey -##anda -##コメント -595 -nikkie -##ルハイト -dhl -publishing -##mall -##gnet -20cm -513 -##クセス -##┅ -e88 -970 -##dog -fishbase -##! -##" -### -##$ -##% -##& -##' -##( -##) -##* -##+ -##, -##- -##. -##/ -##: -##; -##< -##= -##> -##? -##@ -##[ -##\ -##] -##^ -##_ -##{ -##| -##} -##~ -##£ -##¤ -##¥ -##§ -##« -##± -##³ -##µ -##· -##¹ -##º -##» -##¼ -##ß -##æ -##÷ -##ø -##đ -##ŋ -##ɔ -##ə -##ɡ -##ʰ -##ˇ -##ˈ -##ˊ -##ˋ -##ˍ -##ː -##˙ -##˚ -##ˢ -##α -##β -##γ -##δ -##ε -##η -##θ -##ι -##κ -##λ -##μ -##ν -##ο -##π -##ρ -##ς -##σ -##τ -##υ -##φ -##χ -##ψ -##б -##в -##г -##д -##е -##ж -##з -##к -##л -##м -##н -##о -##п -##р -##с -##т -##у -##ф -##х -##ц -##ч -##ш -##ы -##ь -##і -##ا -##ب -##ة -##ت -##د -##ر -##س -##ع -##ل -##م -##ن -##ه -##و -##ي -##۩ -##ก -##ง -##น -##ม -##ย -##ร -##อ -##า -##เ -##๑ -##་ -##ღ -##ᄀ -##ᄁ -##ᄂ -##ᄃ -##ᄅ -##ᄆ -##ᄇ -##ᄈ -##ᄉ -##ᄋ -##ᄌ -##ᄎ -##ᄏ -##ᄐ -##ᄑ -##ᄒ -##ᅢ -##ᅣ -##ᅥ -##ᅦ -##ᅧ -##ᅨ -##ᅪ -##ᅬ -##ᅭ -##ᅮ -##ᅯ -##ᅲ -##ᅳ -##ᅴ -##ᆷ -##ᆸ -##ᆺ -##ᆻ -##ᗜ -##ᵃ -##ᵉ -##ᵍ -##ᵏ -##ᵐ -##ᵒ -##ᵘ -##‖ -##„ -##† -##• -##‥ -##‧ -##
 -##‰ -##′ -##″ -##‹ -##› -##※ -##‿ -##⁄ -##ⁱ -##⁺ -##ⁿ -##₁ -##₃ -##₄ -##€ -##№ -##ⅰ -##ⅱ -##ⅲ -##ⅳ -##ⅴ -##↔ -##↗ -##↘ -##⇒ -##∀ -##− -##∕ -##∙ -##√ -##∞ -##∟ -##∠ -##∣ -##∩ -##∮ -##∶ -##∼ -##∽ -##≈ -##≒ -##≡ -##≤ -##≥ -##≦ -##≧ -##≪ -##≫ -##⊙ -##⋅ -##⋈ -##⋯ -##⌒ -##① -##② -##③ -##④ -##⑤ -##⑥ -##⑦ -##⑧ -##⑨ -##⑩ -##⑴ -##⑵ -##⑶ -##⑷ -##⑸ -##⒈ -##⒉ -##⒊ -##⒋ -##ⓒ -##ⓔ -##ⓘ -##━ -##┃ -##┆ -##┊ -##┌ -##└ -##├ -##┣ -##═ -##║ -##╚ -##╞ -##╠ -##╭ -##╮ -##╯ -##╰ -##╱ -##╳ -##▂ -##▃ -##▅ -##▇ -##▉ -##▋ -##▌ -##▍ -##▎ -##□ -##▪ -##▫ -##▬ -##△ -##▶ -##► -##▽ -##◇ -##◕ -##◠ -##◢ -##◤ -##☀ -##☕ -##☞ -##☺ -##☼ -##♀ -##♂ -##♠ -##♡ -##♣ -##♦ -##♫ -##♬ -##✈ -##✔ -##✕ -##✖ -##✦ -##✨ -##✪ -##✰ -##✿ -##❀ -##➜ -##➤ -##⦿ -##、 -##。 -##〃 -##々 -##〇 -##〈 -##〉 -##《 -##》 -##「 -##」 -##『 -##』 -##【 -##】 -##〓 -##〔 -##〕 -##〖 -##〗 -##〜 -##〝 -##〞 -##ぃ -##ぇ -##ぬ -##ふ -##ほ -##む -##ゃ -##ゅ -##ゆ -##ょ -##゜ -##ゝ -##ァ -##ゥ -##エ -##ォ -##ケ -##サ -##セ -##ソ -##ッ -##ニ -##ヌ -##ネ -##ノ -##ヘ -##モ -##ャ -##ヤ -##ュ -##ユ -##ョ -##ヨ -##ワ -##ヲ -##・ -##ヽ -##ㄅ -##ㄆ -##ㄇ -##ㄉ -##ㄋ -##ㄌ -##ㄍ -##ㄎ -##ㄏ -##ㄒ -##ㄚ -##ㄛ -##ㄞ -##ㄟ -##ㄢ -##ㄤ -##ㄥ -##ㄧ -##ㄨ -##ㆍ -##㈦ -##㊣ -##㗎 -##一 -##丁 -##七 -##万 -##丈 -##三 -##上 -##下 -##不 -##与 -##丐 -##丑 -##专 -##且 -##丕 -##世 -##丘 -##丙 -##业 -##丛 -##东 -##丝 -##丞 -##丟 -##両 -##丢 -##两 -##严 -##並 -##丧 -##丨 -##个 -##丫 -##中 -##丰 -##串 -##临 -##丶 -##丸 -##丹 -##为 -##主 -##丼 -##丽 -##举 -##丿 -##乂 -##乃 -##久 -##么 -##义 -##之 -##乌 -##乍 -##乎 -##乏 -##乐 -##乒 -##乓 -##乔 -##乖 -##乗 -##乘 -##乙 -##乜 -##九 -##乞 -##也 -##习 -##乡 -##书 -##乩 -##买 -##乱 -##乳 -##乾 -##亀 -##亂 -##了 -##予 -##争 -##事 -##二 -##于 -##亏 -##云 -##互 -##五 -##井 -##亘 -##亙 -##亚 -##些 -##亜 -##亞 -##亟 -##亡 -##亢 -##交 -##亥 -##亦 -##产 -##亨 -##亩 -##享 -##京 -##亭 -##亮 -##亲 -##亳 -##亵 -##人 -##亿 -##什 -##仁 -##仃 -##仄 -##仅 -##仆 -##仇 -##今 -##介 -##仍 -##从 -##仏 -##仑 -##仓 -##仔 -##仕 -##他 -##仗 -##付 -##仙 -##仝 -##仞 -##仟 -##代 -##令 -##以 -##仨 -##仪 -##们 -##仮 -##仰 -##仲 -##件 -##价 -##任 -##份 -##仿 -##企 -##伉 -##伊 -##伍 -##伎 -##伏 -##伐 -##休 -##伕 -##众 -##优 -##伙 -##会 -##伝 -##伞 -##伟 -##传 -##伢 -##伤 -##伦 -##伪 -##伫 -##伯 -##估 
-##伴 -##伶 -##伸 -##伺 -##似 -##伽 -##佃 -##但 -##佇 -##佈 -##位 -##低 -##住 -##佐 -##佑 -##体 -##佔 -##何 -##佗 -##佘 -##余 -##佚 -##佛 -##作 -##佝 -##佞 -##佟 -##你 -##佢 -##佣 -##佤 -##佥 -##佩 -##佬 -##佯 -##佰 -##佳 -##併 -##佶 -##佻 -##佼 -##使 -##侃 -##侄 -##來 -##侈 -##例 -##侍 -##侏 -##侑 -##侖 -##侗 -##供 -##依 -##侠 -##価 -##侣 -##侥 -##侦 -##侧 -##侨 -##侬 -##侮 -##侯 -##侵 -##侶 -##侷 -##便 -##係 -##促 -##俄 -##俊 -##俎 -##俏 -##俐 -##俑 -##俗 -##俘 -##俚 -##保 -##俞 -##俟 -##俠 -##信 -##俨 -##俩 -##俪 -##俬 -##俭 -##修 -##俯 -##俱 -##俳 -##俸 -##俺 -##俾 -##倆 -##倉 -##個 -##倌 -##倍 -##倏 -##們 -##倒 -##倔 -##倖 -##倘 -##候 -##倚 -##倜 -##借 -##倡 -##値 -##倦 -##倩 -##倪 -##倫 -##倬 -##倭 -##倶 -##债 -##值 -##倾 -##偃 -##假 -##偈 -##偉 -##偌 -##偎 -##偏 -##偕 -##做 -##停 -##健 -##側 -##偵 -##偶 -##偷 -##偻 -##偽 -##偿 -##傀 -##傅 -##傍 -##傑 -##傘 -##備 -##傚 -##傢 -##傣 -##傥 -##储 -##傩 -##催 -##傭 -##傲 -##傳 -##債 -##傷 -##傻 -##傾 -##僅 -##働 -##像 -##僑 -##僕 -##僖 -##僚 -##僥 -##僧 -##僭 -##僮 -##僱 -##僵 -##價 -##僻 -##儀 -##儂 -##億 -##儆 -##儉 -##儋 -##儒 -##儕 -##儘 -##償 -##儡 -##優 -##儲 -##儷 -##儼 -##儿 -##兀 -##允 -##元 -##兄 -##充 -##兆 -##兇 -##先 -##光 -##克 -##兌 -##免 -##児 -##兑 -##兒 -##兔 -##兖 -##党 -##兜 -##兢 -##入 -##內 -##全 -##兩 -##八 -##公 -##六 -##兮 -##兰 -##共 -##兲 -##关 -##兴 -##兵 -##其 -##具 -##典 -##兹 -##养 -##兼 -##兽 -##冀 -##内 -##円 -##冇 -##冈 -##冉 -##冊 -##册 -##再 -##冏 -##冒 -##冕 -##冗 -##写 -##军 -##农 -##冠 -##冢 -##冤 -##冥 -##冨 -##冪 -##冬 -##冯 -##冰 -##冲 -##决 -##况 -##冶 -##冷 -##冻 -##冼 -##冽 -##冾 -##净 -##凄 -##准 -##凇 -##凈 -##凉 -##凋 -##凌 -##凍 -##减 -##凑 -##凛 -##凜 -##凝 -##几 -##凡 -##凤 -##処 -##凪 -##凭 -##凯 -##凰 -##凱 -##凳 -##凶 -##凸 -##凹 -##出 -##击 -##函 -##凿 -##刀 -##刁 -##刃 -##分 -##切 -##刈 -##刊 -##刍 -##刎 -##刑 -##划 -##列 -##刘 -##则 -##刚 -##创 -##初 -##删 -##判 -##別 -##刨 -##利 -##刪 -##别 -##刮 -##到 -##制 -##刷 -##券 -##刹 -##刺 -##刻 -##刽 -##剁 -##剂 -##剃 -##則 -##剉 -##削 -##剋 -##剌 -##前 -##剎 -##剐 -##剑 -##剔 -##剖 -##剛 -##剜 -##剝 -##剣 -##剤 -##剥 -##剧 -##剩 -##剪 -##副 -##割 -##創 -##剷 -##剽 -##剿 -##劃 -##劇 -##劈 -##劉 -##劊 -##劍 -##劏 -##劑 -##力 -##劝 -##办 -##功 -##加 -##务 -##劣 -##动 -##助 -##努 -##劫 -##劭 -##励 -##劲 -##劳 -##労 -##劵 -##効 -##劾 -##势 -##勁 -##勃 -##勇 -##勉 -##勋 -##勐 -##勒 -##動 -##勖 -##勘 -##務 -##勛 
-##勝 -##勞 -##募 -##勢 -##勤 -##勧 -##勳 -##勵 -##勸 -##勺 -##勻 -##勾 -##勿 -##匀 -##包 -##匆 -##匈 -##匍 -##匐 -##匕 -##化 -##北 -##匙 -##匝 -##匠 -##匡 -##匣 -##匪 -##匮 -##匯 -##匱 -##匹 -##区 -##医 -##匾 -##匿 -##區 -##十 -##千 -##卅 -##升 -##午 -##卉 -##半 -##卍 -##华 -##协 -##卑 -##卒 -##卓 -##協 -##单 -##卖 -##南 -##単 -##博 -##卜 -##卞 -##卟 -##占 -##卡 -##卢 -##卤 -##卦 -##卧 -##卫 -##卮 -##卯 -##印 -##危 -##即 -##却 -##卵 -##卷 -##卸 -##卻 -##卿 -##厂 -##厄 -##厅 -##历 -##厉 -##压 -##厌 -##厕 -##厘 -##厚 -##厝 -##原 -##厢 -##厥 -##厦 -##厨 -##厩 -##厭 -##厮 -##厲 -##厳 -##去 -##县 -##叁 -##参 -##參 -##又 -##叉 -##及 -##友 -##双 -##反 -##収 -##发 -##叔 -##取 -##受 -##变 -##叙 -##叛 -##叟 -##叠 -##叡 -##叢 -##口 -##古 -##句 -##另 -##叨 -##叩 -##只 -##叫 -##召 -##叭 -##叮 -##可 -##台 -##叱 -##史 -##右 -##叵 -##叶 -##号 -##司 -##叹 -##叻 -##叼 -##叽 -##吁 -##吃 -##各 -##吆 -##合 -##吉 -##吊 -##吋 -##同 -##名 -##后 -##吏 -##吐 -##向 -##吒 -##吓 -##吕 -##吖 -##吗 -##君 -##吝 -##吞 -##吟 -##吠 -##吡 -##否 -##吧 -##吨 -##吩 -##含 -##听 -##吭 -##吮 -##启 -##吱 -##吳 -##吴 -##吵 -##吶 -##吸 -##吹 -##吻 -##吼 -##吽 -##吾 -##呀 -##呂 -##呃 -##呆 -##呈 -##告 -##呋 -##呎 -##呐 -##呓 -##呕 -##呗 -##员 -##呛 -##呜 -##呢 -##呤 -##呦 -##周 -##呱 -##呲 -##味 -##呵 -##呷 -##呸 -##呻 -##呼 -##命 -##咀 -##咁 -##咂 -##咄 -##咆 -##咋 -##和 -##咎 -##咏 -##咐 -##咒 -##咔 -##咕 -##咖 -##咗 -##咘 -##咙 -##咚 -##咛 -##咣 -##咤 -##咦 -##咧 -##咨 -##咩 -##咪 -##咫 -##咬 -##咭 -##咯 -##咱 -##咲 -##咳 -##咸 -##咻 -##咽 -##咿 -##哀 -##品 -##哂 -##哄 -##哆 -##哇 -##哈 -##哉 -##哋 -##哌 -##响 -##哎 -##哏 -##哐 -##哑 -##哒 -##哔 -##哗 -##哟 -##員 -##哥 -##哦 -##哧 -##哨 -##哩 -##哪 -##哭 -##哮 -##哲 -##哺 -##哼 -##哽 -##唁 -##唄 -##唆 -##唇 -##唉 -##唏 -##唐 -##唑 -##唔 -##唠 -##唤 -##唧 -##唬 -##售 -##唯 -##唰 -##唱 -##唳 -##唷 -##唸 -##唾 -##啃 -##啄 -##商 -##啉 -##啊 -##問 -##啓 -##啕 -##啖 -##啜 -##啞 -##啟 -##啡 -##啤 -##啥 -##啦 -##啧 -##啪 -##啫 -##啬 -##啮 -##啰 -##啱 -##啲 -##啵 -##啶 -##啷 -##啸 -##啻 -##啼 -##啾 -##喀 -##喂 -##喃 -##善 -##喆 -##喇 -##喉 -##喊 -##喋 -##喎 -##喏 -##喔 -##喘 -##喙 -##喚 -##喜 -##喝 -##喟 -##喧 -##喪 -##喫 -##喬 -##單 -##喰 -##喱 -##喲 -##喳 -##喵 -##営 -##喷 -##喹 -##喺 -##喻 -##喽 -##嗅 -##嗆 -##嗇 -##嗎 -##嗑 -##嗒 -##嗓 -##嗔 -##嗖 -##嗚 -##嗜 -##嗝 -##嗟 -##嗡 -##嗣 -##嗤 -##嗦 -##嗨 -##嗪 -##嗬 -##嗯 -##嗰 -##嗲 -##嗳 -##嗶 -##嗷 -##嗽 
-##嘀 -##嘅 -##嘆 -##嘈 -##嘉 -##嘌 -##嘍 -##嘎 -##嘔 -##嘖 -##嘗 -##嘘 -##嘚 -##嘛 -##嘜 -##嘞 -##嘟 -##嘢 -##嘣 -##嘤 -##嘧 -##嘩 -##嘭 -##嘮 -##嘯 -##嘰 -##嘱 -##嘲 -##嘴 -##嘶 -##嘸 -##嘹 -##嘻 -##嘿 -##噁 -##噌 -##噎 -##噓 -##噔 -##噗 -##噙 -##噜 -##噠 -##噢 -##噤 -##器 -##噩 -##噪 -##噬 -##噱 -##噴 -##噶 -##噸 -##噹 -##噻 -##噼 -##嚀 -##嚇 -##嚎 -##嚏 -##嚐 -##嚓 -##嚕 -##嚟 -##嚣 -##嚥 -##嚨 -##嚮 -##嚴 -##嚷 -##嚼 -##囂 -##囉 -##囊 -##囍 -##囑 -##囔 -##囗 -##囚 -##四 -##囝 -##回 -##囟 -##因 -##囡 -##团 -##団 -##囤 -##囧 -##囪 -##囫 -##园 -##困 -##囱 -##囲 -##図 -##围 -##囹 -##固 -##国 -##图 -##囿 -##圃 -##圄 -##圆 -##圈 -##國 -##圍 -##圏 -##園 -##圓 -##圖 -##團 -##圜 -##土 -##圣 -##圧 -##在 -##圩 -##圭 -##地 -##圳 -##场 -##圻 -##圾 -##址 -##坂 -##均 -##坊 -##坍 -##坎 -##坏 -##坐 -##坑 -##块 -##坚 -##坛 -##坝 -##坞 -##坟 -##坠 -##坡 -##坤 -##坦 -##坨 -##坪 -##坯 -##坳 -##坵 -##坷 -##垂 -##垃 -##垄 -##型 -##垒 -##垚 -##垛 -##垠 -##垢 -##垣 -##垦 -##垩 -##垫 -##垭 -##垮 -##垵 -##埂 -##埃 -##埋 -##城 -##埔 -##埕 -##埗 -##域 -##埠 -##埤 -##埵 -##執 -##埸 -##培 -##基 -##埼 -##堀 -##堂 -##堃 -##堅 -##堆 -##堇 -##堑 -##堕 -##堙 -##堡 -##堤 -##堪 -##堯 -##堰 -##報 -##場 -##堵 -##堺 -##堿 -##塊 -##塌 -##塑 -##塔 -##塗 -##塘 -##塚 -##塞 -##塢 -##塩 -##填 -##塬 -##塭 -##塵 -##塾 -##墀 -##境 -##墅 -##墉 -##墊 -##墒 -##墓 -##増 -##墘 -##墙 -##墜 -##增 -##墟 -##墨 -##墩 -##墮 -##墳 -##墻 -##墾 -##壁 -##壅 -##壆 -##壇 -##壊 -##壑 -##壓 -##壕 -##壘 -##壞 -##壟 -##壢 -##壤 -##壩 -##士 -##壬 -##壮 -##壯 -##声 -##売 -##壳 -##壶 -##壹 -##壺 -##壽 -##处 -##备 -##変 -##复 -##夏 -##夔 -##夕 -##外 -##夙 -##多 -##夜 -##够 -##夠 -##夢 -##夥 -##大 -##天 -##太 -##夫 -##夭 -##央 -##夯 -##失 -##头 -##夷 -##夸 -##夹 -##夺 -##夾 -##奂 -##奄 -##奇 -##奈 -##奉 -##奋 -##奎 -##奏 -##奐 -##契 -##奔 -##奕 -##奖 -##套 -##奘 -##奚 -##奠 -##奢 -##奥 -##奧 -##奪 -##奬 -##奮 -##女 -##奴 -##奶 -##奸 -##她 -##好 -##如 -##妃 -##妄 -##妆 -##妇 -##妈 -##妊 -##妍 -##妒 -##妓 -##妖 -##妘 -##妙 -##妝 -##妞 -##妣 -##妤 -##妥 -##妨 -##妩 -##妪 -##妮 -##妲 -##妳 -##妹 -##妻 -##妾 -##姆 -##姉 -##姊 -##始 -##姍 -##姐 -##姑 -##姒 -##姓 -##委 -##姗 -##姚 -##姜 -##姝 -##姣 -##姥 -##姦 -##姨 -##姪 -##姫 -##姬 -##姹 -##姻 -##姿 -##威 -##娃 -##娄 -##娅 -##娆 -##娇 -##娉 -##娑 -##娓 -##娘 -##娛 -##娜 -##娟 -##娠 -##娣 -##娥 -##娩 -##娱 -##娲 -##娴 -##娶 -##娼 -##婀 -##婁 -##婆 -##婉 -##婊 -##婕 -##婚 -##婢 -##婦 
-##婧 -##婪 -##婭 -##婴 -##婵 -##婶 -##婷 -##婺 -##婿 -##媒 -##媚 -##媛 -##媞 -##媧 -##媲 -##媳 -##媽 -##媾 -##嫁 -##嫂 -##嫉 -##嫌 -##嫑 -##嫔 -##嫖 -##嫘 -##嫚 -##嫡 -##嫣 -##嫦 -##嫩 -##嫲 -##嫵 -##嫻 -##嬅 -##嬉 -##嬌 -##嬗 -##嬛 -##嬢 -##嬤 -##嬪 -##嬰 -##嬴 -##嬷 -##嬸 -##嬿 -##孀 -##孃 -##子 -##孑 -##孔 -##孕 -##孖 -##字 -##存 -##孙 -##孚 -##孛 -##孜 -##孝 -##孟 -##孢 -##季 -##孤 -##学 -##孩 -##孪 -##孫 -##孬 -##孰 -##孱 -##孳 -##孵 -##學 -##孺 -##孽 -##孿 -##宁 -##它 -##宅 -##宇 -##守 -##安 -##宋 -##完 -##宏 -##宓 -##宕 -##宗 -##官 -##宙 -##定 -##宛 -##宜 -##宝 -##实 -##実 -##宠 -##审 -##客 -##宣 -##室 -##宥 -##宦 -##宪 -##宫 -##宮 -##宰 -##害 -##宴 -##宵 -##家 -##宸 -##容 -##宽 -##宾 -##宿 -##寂 -##寄 -##寅 -##密 -##寇 -##富 -##寐 -##寒 -##寓 -##寛 -##寝 -##寞 -##察 -##寡 -##寢 -##寥 -##實 -##寧 -##寨 -##審 -##寫 -##寬 -##寮 -##寰 -##寵 -##寶 -##寸 -##对 -##寺 -##寻 -##导 -##対 -##寿 -##封 -##専 -##射 -##将 -##將 -##專 -##尉 -##尊 -##尋 -##對 -##導 -##小 -##少 -##尔 -##尕 -##尖 -##尘 -##尚 -##尝 -##尤 -##尧 -##尬 -##就 -##尴 -##尷 -##尸 -##尹 -##尺 -##尻 -##尼 -##尽 -##尾 -##尿 -##局 -##屁 -##层 -##屄 -##居 -##屆 -##屈 -##屉 -##届 -##屋 -##屌 -##屍 -##屎 -##屏 -##屐 -##屑 -##展 -##屜 -##属 -##屠 -##屡 -##屢 -##層 -##履 -##屬 -##屯 -##山 -##屹 -##屿 -##岀 -##岁 -##岂 -##岌 -##岐 -##岑 -##岔 -##岖 -##岗 -##岘 -##岙 -##岚 -##岛 -##岡 -##岩 -##岫 -##岬 -##岭 -##岱 -##岳 -##岷 -##岸 -##峇 -##峋 -##峒 -##峙 -##峡 -##峤 -##峥 -##峦 -##峨 -##峪 -##峭 -##峯 -##峰 -##峴 -##島 -##峻 -##峽 -##崁 -##崂 -##崆 -##崇 -##崎 -##崑 -##崔 -##崖 -##崗 -##崙 -##崛 -##崧 -##崩 -##崭 -##崴 -##崽 -##嵇 -##嵊 -##嵋 -##嵌 -##嵐 -##嵘 -##嵩 -##嵬 -##嵯 -##嶂 -##嶄 -##嶇 -##嶋 -##嶙 -##嶺 -##嶼 -##嶽 -##巅 -##巍 -##巒 -##巔 -##巖 -##川 -##州 -##巡 -##巢 -##工 -##左 -##巧 -##巨 -##巩 -##巫 -##差 -##己 -##已 -##巳 -##巴 -##巷 -##巻 -##巽 -##巾 -##巿 -##币 -##市 -##布 -##帅 -##帆 -##师 -##希 -##帐 -##帑 -##帕 -##帖 -##帘 -##帚 -##帛 -##帜 -##帝 -##帥 -##带 -##帧 -##師 -##席 -##帮 -##帯 -##帰 -##帳 -##帶 -##帷 -##常 -##帼 -##帽 -##幀 -##幂 -##幄 -##幅 -##幌 -##幔 -##幕 -##幟 -##幡 -##幢 -##幣 -##幫 -##干 -##平 -##年 -##并 -##幸 -##幹 -##幺 -##幻 -##幼 -##幽 -##幾 -##广 -##庁 -##広 -##庄 -##庆 -##庇 -##床 -##序 -##庐 -##库 -##应 -##底 -##庖 -##店 -##庙 -##庚 -##府 -##庞 -##废 -##庠 -##度 -##座 -##庫 -##庭 -##庵 -##庶 -##康 -##庸 -##庹 -##庾 -##廁 -##廂 -##廃 -##廈 -##廉 -##廊 -##廓 
-##廖 -##廚 -##廝 -##廟 -##廠 -##廢 -##廣 -##廬 -##廳 -##延 -##廷 -##建 -##廿 -##开 -##弁 -##异 -##弃 -##弄 -##弈 -##弊 -##弋 -##式 -##弑 -##弒 -##弓 -##弔 -##引 -##弗 -##弘 -##弛 -##弟 -##张 -##弥 -##弦 -##弧 -##弩 -##弭 -##弯 -##弱 -##張 -##強 -##弹 -##强 -##弼 -##弾 -##彅 -##彆 -##彈 -##彌 -##彎 -##归 -##当 -##录 -##彗 -##彙 -##彝 -##形 -##彤 -##彥 -##彦 -##彧 -##彩 -##彪 -##彫 -##彬 -##彭 -##彰 -##影 -##彷 -##役 -##彻 -##彼 -##彿 -##往 -##征 -##径 -##待 -##徇 -##很 -##徉 -##徊 -##律 -##後 -##徐 -##徑 -##徒 -##従 -##徕 -##得 -##徘 -##徙 -##徜 -##從 -##徠 -##御 -##徨 -##復 -##循 -##徬 -##微 -##徳 -##徴 -##徵 -##德 -##徹 -##徼 -##徽 -##心 -##必 -##忆 -##忌 -##忍 -##忏 -##忐 -##忑 -##忒 -##忖 -##志 -##忘 -##忙 -##応 -##忠 -##忡 -##忤 -##忧 -##忪 -##快 -##忱 -##念 -##忻 -##忽 -##忿 -##怀 -##态 -##怂 -##怅 -##怆 -##怎 -##怏 -##怒 -##怔 -##怕 -##怖 -##怙 -##怜 -##思 -##怠 -##怡 -##急 -##怦 -##性 -##怨 -##怪 -##怯 -##怵 -##总 -##怼 -##恁 -##恃 -##恆 -##恋 -##恍 -##恐 -##恒 -##恕 -##恙 -##恚 -##恢 -##恣 -##恤 -##恥 -##恨 -##恩 -##恪 -##恫 -##恬 -##恭 -##息 -##恰 -##恳 -##恵 -##恶 -##恸 -##恺 -##恻 -##恼 -##恿 -##悄 -##悅 -##悉 -##悌 -##悍 -##悔 -##悖 -##悚 -##悟 -##悠 -##患 -##悦 -##您 -##悩 -##悪 -##悬 -##悯 -##悱 -##悲 -##悴 -##悵 -##悶 -##悸 -##悻 -##悼 -##悽 -##情 -##惆 -##惇 -##惊 -##惋 -##惑 -##惕 -##惘 -##惚 -##惜 -##惟 -##惠 -##惡 -##惦 -##惧 -##惨 -##惩 -##惫 -##惬 -##惭 -##惮 -##惯 -##惰 -##惱 -##想 -##惴 -##惶 -##惹 -##惺 -##愁 -##愆 -##愈 -##愉 -##愍 -##意 -##愕 -##愚 -##愛 -##愜 -##感 -##愣 -##愤 -##愧 -##愫 -##愷 -##愿 -##慄 -##慈 -##態 -##慌 -##慎 -##慑 -##慕 -##慘 -##慚 -##慟 -##慢 -##慣 -##慧 -##慨 -##慫 -##慮 -##慰 -##慳 -##慵 -##慶 -##慷 -##慾 -##憂 -##憊 -##憋 -##憎 -##憐 -##憑 -##憔 -##憚 -##憤 -##憧 -##憨 -##憩 -##憫 -##憬 -##憲 -##憶 -##憾 -##懂 -##懇 -##懈 -##應 -##懊 -##懋 -##懑 -##懒 -##懦 -##懲 -##懵 -##懶 -##懷 -##懸 -##懺 -##懼 -##懾 -##懿 -##戀 -##戈 -##戊 -##戌 -##戍 -##戎 -##戏 -##成 -##我 -##戒 -##戕 -##或 -##战 -##戚 -##戛 -##戟 -##戡 -##戦 -##截 -##戬 -##戮 -##戰 -##戲 -##戳 -##戴 -##戶 -##户 -##戸 -##戻 -##戾 -##房 -##所 -##扁 -##扇 -##扈 -##扉 -##手 -##才 -##扎 -##扑 -##扒 -##打 -##扔 -##払 -##托 -##扛 -##扣 -##扦 -##执 -##扩 -##扪 -##扫 -##扬 -##扭 -##扮 -##扯 -##扰 -##扱 -##扳 -##扶 -##批 -##扼 -##找 -##承 -##技 -##抄 -##抉 -##把 -##抑 -##抒 -##抓 -##投 -##抖 -##抗 -##折 -##抚 -##抛 -##抜 -##択 -##抟 -##抠 -##抡 -##抢 -##护 
-##报 -##抨 -##披 -##抬 -##抱 -##抵 -##抹 -##押 -##抽 -##抿 -##拂 -##拄 -##担 -##拆 -##拇 -##拈 -##拉 -##拋 -##拌 -##拍 -##拎 -##拐 -##拒 -##拓 -##拔 -##拖 -##拗 -##拘 -##拙 -##拚 -##招 -##拜 -##拟 -##拡 -##拢 -##拣 -##拥 -##拦 -##拧 -##拨 -##择 -##括 -##拭 -##拮 -##拯 -##拱 -##拳 -##拴 -##拷 -##拼 -##拽 -##拾 -##拿 -##持 -##挂 -##指 -##挈 -##按 -##挎 -##挑 -##挖 -##挙 -##挚 -##挛 -##挝 -##挞 -##挟 -##挠 -##挡 -##挣 -##挤 -##挥 -##挨 -##挪 -##挫 -##振 -##挲 -##挹 -##挺 -##挽 -##挾 -##捂 -##捅 -##捆 -##捉 -##捋 -##捌 -##捍 -##捎 -##捏 -##捐 -##捕 -##捞 -##损 -##捡 -##换 -##捣 -##捧 -##捨 -##捩 -##据 -##捱 -##捲 -##捶 -##捷 -##捺 -##捻 -##掀 -##掂 -##掃 -##掇 -##授 -##掉 -##掌 -##掏 -##掐 -##排 -##掖 -##掘 -##掙 -##掛 -##掠 -##採 -##探 -##掣 -##接 -##控 -##推 -##掩 -##措 -##掬 -##掰 -##掲 -##掳 -##掴 -##掷 -##掸 -##掺 -##揀 -##揃 -##揄 -##揆 -##揉 -##揍 -##描 -##提 -##插 -##揖 -##揚 -##換 -##握 -##揣 -##揩 -##揪 -##揭 -##揮 -##援 -##揶 -##揸 -##揹 -##揽 -##搀 -##搁 -##搂 -##搅 -##損 -##搏 -##搐 -##搓 -##搔 -##搖 -##搗 -##搜 -##搞 -##搡 -##搪 -##搬 -##搭 -##搵 -##搶 -##携 -##搽 -##摀 -##摁 -##摄 -##摆 -##摇 -##摈 -##摊 -##摒 -##摔 -##摘 -##摞 -##摟 -##摧 -##摩 -##摯 -##摳 -##摸 -##摹 -##摺 -##摻 -##撂 -##撃 -##撅 -##撇 -##撈 -##撐 -##撑 -##撒 -##撓 -##撕 -##撚 -##撞 -##撤 -##撥 -##撩 -##撫 -##撬 -##播 -##撮 -##撰 -##撲 -##撵 -##撷 -##撸 -##撻 -##撼 -##撿 -##擀 -##擁 -##擂 -##擄 -##擅 -##擇 -##擊 -##擋 -##操 -##擎 -##擒 -##擔 -##擘 -##據 -##擞 -##擠 -##擡 -##擢 -##擦 -##擬 -##擰 -##擱 -##擲 -##擴 -##擷 -##擺 -##擼 -##擾 -##攀 -##攏 -##攒 -##攔 -##攘 -##攙 -##攜 -##攝 -##攞 -##攢 -##攣 -##攤 -##攥 -##攪 -##攫 -##攬 -##支 -##收 -##攸 -##改 -##攻 -##放 -##政 -##故 -##效 -##敌 -##敍 -##敎 -##敏 -##救 -##敕 -##敖 -##敗 -##敘 -##教 -##敛 -##敝 -##敞 -##敢 -##散 -##敦 -##敬 -##数 -##敲 -##整 -##敵 -##敷 -##數 -##斂 -##斃 -##文 -##斋 -##斌 -##斎 -##斐 -##斑 -##斓 -##斗 -##料 -##斛 -##斜 -##斟 -##斡 -##斤 -##斥 -##斧 -##斩 -##斫 -##斬 -##断 -##斯 -##新 -##斷 -##方 -##於 -##施 -##旁 -##旃 -##旅 -##旋 -##旌 -##旎 -##族 -##旖 -##旗 -##无 -##既 -##日 -##旦 -##旧 -##旨 -##早 -##旬 -##旭 -##旮 -##旱 -##时 -##旷 -##旺 -##旻 -##昀 -##昂 -##昆 -##昇 -##昉 -##昊 -##昌 -##明 -##昏 -##易 -##昔 -##昕 -##昙 -##星 -##映 -##春 -##昧 -##昨 -##昭 -##是 -##昱 -##昴 -##昵 -##昶 -##昼 -##显 -##晁 -##時 -##晃 -##晉 -##晋 -##晌 -##晏 -##晒 -##晓 -##晔 -##晕 -##晖 -##晗 -##晚 -##晝 -##晞 -##晟 
-##晤 -##晦 -##晨 -##晩 -##普 -##景 -##晰 -##晴 -##晶 -##晷 -##智 -##晾 -##暂 -##暄 -##暇 -##暈 -##暉 -##暌 -##暐 -##暑 -##暖 -##暗 -##暝 -##暢 -##暧 -##暨 -##暫 -##暮 -##暱 -##暴 -##暸 -##暹 -##曄 -##曆 -##曇 -##曉 -##曖 -##曙 -##曜 -##曝 -##曠 -##曦 -##曬 -##曰 -##曲 -##曳 -##更 -##書 -##曹 -##曼 -##曾 -##替 -##最 -##會 -##月 -##有 -##朋 -##服 -##朐 -##朔 -##朕 -##朗 -##望 -##朝 -##期 -##朦 -##朧 -##木 -##未 -##末 -##本 -##札 -##朮 -##术 -##朱 -##朴 -##朵 -##机 -##朽 -##杀 -##杂 -##权 -##杆 -##杈 -##杉 -##李 -##杏 -##材 -##村 -##杓 -##杖 -##杜 -##杞 -##束 -##杠 -##条 -##来 -##杨 -##杭 -##杯 -##杰 -##東 -##杳 -##杵 -##杷 -##杼 -##松 -##板 -##极 -##构 -##枇 -##枉 -##枋 -##析 -##枕 -##林 -##枚 -##果 -##枝 -##枢 -##枣 -##枪 -##枫 -##枭 -##枯 -##枰 -##枱 -##枳 -##架 -##枷 -##枸 -##柄 -##柏 -##某 -##柑 -##柒 -##染 -##柔 -##柘 -##柚 -##柜 -##柞 -##柠 -##柢 -##查 -##柩 -##柬 -##柯 -##柱 -##柳 -##柴 -##柵 -##査 -##柿 -##栀 -##栃 -##栄 -##栅 -##标 -##栈 -##栉 -##栋 -##栎 -##栏 -##树 -##栓 -##栖 -##栗 -##校 -##栩 -##株 -##样 -##核 -##根 -##格 -##栽 -##栾 -##桀 -##桁 -##桂 -##桃 -##桅 -##框 -##案 -##桉 -##桌 -##桎 -##桐 -##桑 -##桓 -##桔 -##桜 -##桠 -##桡 -##桢 -##档 -##桥 -##桦 -##桧 -##桨 -##桩 -##桶 -##桿 -##梁 -##梅 -##梆 -##梏 -##梓 -##梗 -##條 -##梟 -##梢 -##梦 -##梧 -##梨 -##梭 -##梯 -##械 -##梳 -##梵 -##梶 -##检 -##棂 -##棄 -##棉 -##棋 -##棍 -##棒 -##棕 -##棗 -##棘 -##棚 -##棟 -##棠 -##棣 -##棧 -##森 -##棱 -##棲 -##棵 -##棹 -##棺 -##椁 -##椅 -##椋 -##植 -##椎 -##椒 -##検 -##椪 -##椭 -##椰 -##椹 -##椽 -##椿 -##楂 -##楊 -##楓 -##楔 -##楚 -##楝 -##楞 -##楠 -##楣 -##楨 -##楫 -##業 -##楮 -##極 -##楷 -##楸 -##楹 -##楼 -##楽 -##概 -##榄 -##榆 -##榈 -##榉 -##榔 -##榕 -##榖 -##榛 -##榜 -##榨 -##榫 -##榭 -##榮 -##榱 -##榴 -##榷 -##榻 -##槁 -##槃 -##構 -##槌 -##槍 -##槎 -##槐 -##槓 -##様 -##槛 -##槟 -##槤 -##槭 -##槲 -##槳 -##槻 -##槽 -##槿 -##樁 -##樂 -##樊 -##樑 -##樓 -##標 -##樞 -##樟 -##模 -##樣 -##権 -##横 -##樫 -##樯 -##樱 -##樵 -##樸 -##樹 -##樺 -##樽 -##樾 -##橄 -##橇 -##橋 -##橐 -##橘 -##橙 -##機 -##橡 -##橢 -##橫 -##橱 -##橹 -##橼 -##檀 -##檄 -##檎 -##檐 -##檔 -##檗 -##檜 -##檢 -##檬 -##檯 -##檳 -##檸 -##檻 -##櫃 -##櫚 -##櫛 -##櫥 -##櫸 -##櫻 -##欄 -##權 -##欒 -##欖 -##欠 -##次 -##欢 -##欣 -##欧 -##欲 -##欸 -##欺 -##欽 -##款 -##歆 -##歇 -##歉 -##歌 -##歎 -##歐 -##歓 -##歙 -##歛 -##歡 -##止 -##正 -##此 -##步 -##武 -##歧 -##歩 -##歪 -##歯 -##歲 -##歳 -##歴 -##歷 
-##歸 -##歹 -##死 -##歼 -##殁 -##殃 -##殆 -##殇 -##殉 -##殊 -##残 -##殒 -##殓 -##殖 -##殘 -##殞 -##殡 -##殤 -##殭 -##殯 -##殲 -##殴 -##段 -##殷 -##殺 -##殼 -##殿 -##毀 -##毁 -##毂 -##毅 -##毆 -##毋 -##母 -##毎 -##每 -##毒 -##毓 -##比 -##毕 -##毗 -##毘 -##毙 -##毛 -##毡 -##毫 -##毯 -##毽 -##氈 -##氏 -##氐 -##民 -##氓 -##气 -##氖 -##気 -##氙 -##氛 -##氟 -##氡 -##氢 -##氣 -##氤 -##氦 -##氧 -##氨 -##氪 -##氫 -##氮 -##氯 -##氰 -##氲 -##水 -##氷 -##永 -##氹 -##氾 -##汀 -##汁 -##求 -##汆 -##汇 -##汉 -##汎 -##汐 -##汕 -##汗 -##汙 -##汛 -##汝 -##汞 -##江 -##池 -##污 -##汤 -##汨 -##汩 -##汪 -##汰 -##汲 -##汴 -##汶 -##汹 -##決 -##汽 -##汾 -##沁 -##沂 -##沃 -##沅 -##沈 -##沉 -##沌 -##沏 -##沐 -##沒 -##沓 -##沖 -##沙 -##沛 -##沟 -##没 -##沢 -##沣 -##沥 -##沦 -##沧 -##沪 -##沫 -##沭 -##沮 -##沱 -##河 -##沸 -##油 -##治 -##沼 -##沽 -##沾 -##沿 -##況 -##泄 -##泉 -##泊 -##泌 -##泓 -##法 -##泗 -##泛 -##泞 -##泠 -##泡 -##波 -##泣 -##泥 -##注 -##泪 -##泫 -##泮 -##泯 -##泰 -##泱 -##泳 -##泵 -##泷 -##泸 -##泻 -##泼 -##泽 -##泾 -##洁 -##洄 -##洋 -##洒 -##洗 -##洙 -##洛 -##洞 -##津 -##洩 -##洪 -##洮 -##洱 -##洲 -##洵 -##洶 -##洸 -##洹 -##活 -##洼 -##洽 -##派 -##流 -##浃 -##浄 -##浅 -##浆 -##浇 -##浊 -##测 -##济 -##浏 -##浑 -##浒 -##浓 -##浔 -##浙 -##浚 -##浜 -##浣 -##浦 -##浩 -##浪 -##浬 -##浮 -##浯 -##浴 -##海 -##浸 -##涂 -##涅 -##涇 -##消 -##涉 -##涌 -##涎 -##涓 -##涔 -##涕 -##涙 -##涛 -##涝 -##涞 -##涟 -##涠 -##涡 -##涣 -##涤 -##润 -##涧 -##涨 -##涩 -##涪 -##涮 -##涯 -##液 -##涵 -##涸 -##涼 -##涿 -##淀 -##淄 -##淅 -##淆 -##淇 -##淋 -##淌 -##淑 -##淒 -##淖 -##淘 -##淙 -##淚 -##淞 -##淡 -##淤 -##淦 -##淨 -##淩 -##淪 -##淫 -##淬 -##淮 -##深 -##淳 -##淵 -##混 -##淹 -##淺 -##添 -##淼 -##清 -##済 -##渉 -##渊 -##渋 -##渍 -##渎 -##渐 -##渔 -##渗 -##渙 -##渚 -##減 -##渝 -##渠 -##渡 -##渣 -##渤 -##渥 -##渦 -##温 -##測 -##渭 -##港 -##渲 -##渴 -##游 -##渺 -##渾 -##湃 -##湄 -##湊 -##湍 -##湖 -##湘 -##湛 -##湟 -##湧 -##湫 -##湮 -##湯 -##湳 -##湾 -##湿 -##満 -##溃 -##溅 -##溉 -##溏 -##源 -##準 -##溜 -##溝 -##溟 -##溢 -##溥 -##溧 -##溪 -##溫 -##溯 -##溱 -##溴 -##溶 -##溺 -##溼 -##滁 -##滂 -##滄 -##滅 -##滇 -##滋 -##滌 -##滑 -##滓 -##滔 -##滕 -##滙 -##滚 -##滝 -##滞 -##滟 -##满 -##滢 -##滤 -##滥 -##滦 -##滨 -##滩 -##滬 -##滯 -##滲 -##滴 -##滷 -##滸 -##滾 -##滿 -##漁 -##漂 -##漆 -##漉 -##漏 -##漓 -##演 -##漕 -##漠 -##漢 -##漣 -##漩 -##漪 -##漫 -##漬 -##漯 -##漱 -##漲 -##漳 -##漸 -##漾 -##漿 -##潆 
-##潇 -##潋 -##潍 -##潑 -##潔 -##潘 -##潛 -##潜 -##潞 -##潟 -##潢 -##潤 -##潦 -##潧 -##潭 -##潮 -##潰 -##潴 -##潸 -##潺 -##潼 -##澀 -##澄 -##澆 -##澈 -##澍 -##澎 -##澗 -##澜 -##澡 -##澤 -##澧 -##澱 -##澳 -##澹 -##激 -##濁 -##濂 -##濃 -##濑 -##濒 -##濕 -##濘 -##濛 -##濟 -##濠 -##濡 -##濤 -##濫 -##濬 -##濮 -##濯 -##濱 -##濺 -##濾 -##瀅 -##瀆 -##瀉 -##瀋 -##瀏 -##瀑 -##瀕 -##瀘 -##瀚 -##瀛 -##瀝 -##瀞 -##瀟 -##瀧 -##瀨 -##瀬 -##瀰 -##瀾 -##灌 -##灏 -##灑 -##灘 -##灝 -##灞 -##灣 -##火 -##灬 -##灭 -##灯 -##灰 -##灵 -##灶 -##灸 -##灼 -##災 -##灾 -##灿 -##炀 -##炁 -##炅 -##炉 -##炊 -##炎 -##炒 -##炔 -##炕 -##炖 -##炙 -##炜 -##炫 -##炬 -##炭 -##炮 -##炯 -##炳 -##炷 -##炸 -##点 -##為 -##炼 -##炽 -##烁 -##烂 -##烃 -##烈 -##烊 -##烏 -##烘 -##烙 -##烛 -##烟 -##烤 -##烦 -##烧 -##烨 -##烩 -##烫 -##烬 -##热 -##烯 -##烷 -##烹 -##烽 -##焉 -##焊 -##焕 -##焖 -##焗 -##焘 -##焙 -##焚 -##焜 -##無 -##焦 -##焯 -##焰 -##焱 -##然 -##焼 -##煅 -##煉 -##煊 -##煌 -##煎 -##煒 -##煖 -##煙 -##煜 -##煞 -##煤 -##煥 -##煦 -##照 -##煨 -##煩 -##煮 -##煲 -##煸 -##煽 -##熄 -##熊 -##熏 -##熒 -##熔 -##熙 -##熟 -##熠 -##熨 -##熬 -##熱 -##熵 -##熹 -##熾 -##燁 -##燃 -##燄 -##燈 -##燉 -##燊 -##燎 -##燒 -##燔 -##燕 -##燙 -##燜 -##營 -##燥 -##燦 -##燧 -##燭 -##燮 -##燴 -##燻 -##燼 -##燿 -##爆 -##爍 -##爐 -##爛 -##爪 -##爬 -##爭 -##爰 -##爱 -##爲 -##爵 -##父 -##爷 -##爸 -##爹 -##爺 -##爻 -##爽 -##爾 -##牆 -##片 -##版 -##牌 -##牍 -##牒 -##牙 -##牛 -##牝 -##牟 -##牠 -##牡 -##牢 -##牦 -##牧 -##物 -##牯 -##牲 -##牴 -##牵 -##特 -##牺 -##牽 -##犀 -##犁 -##犄 -##犊 -##犍 -##犒 -##犢 -##犧 -##犬 -##犯 -##状 -##犷 -##犸 -##犹 -##狀 -##狂 -##狄 -##狈 -##狎 -##狐 -##狒 -##狗 -##狙 -##狞 -##狠 -##狡 -##狩 -##独 -##狭 -##狮 -##狰 -##狱 -##狸 -##狹 -##狼 -##狽 -##猎 -##猕 -##猖 -##猗 -##猙 -##猛 -##猜 -##猝 -##猥 -##猩 -##猪 -##猫 -##猬 -##献 -##猴 -##猶 -##猷 -##猾 -##猿 -##獄 -##獅 -##獎 -##獐 -##獒 -##獗 -##獠 -##獣 -##獨 -##獭 -##獰 -##獲 -##獵 -##獷 -##獸 -##獺 -##獻 -##獼 -##獾 -##玄 -##率 -##玉 -##王 -##玑 -##玖 -##玛 -##玟 -##玠 -##玥 -##玩 -##玫 -##玮 -##环 -##现 -##玲 -##玳 -##玷 -##玺 -##玻 -##珀 -##珂 -##珅 -##珈 -##珉 -##珊 -##珍 -##珏 -##珐 -##珑 -##珙 -##珞 -##珠 -##珣 -##珥 -##珩 -##珪 -##班 -##珮 -##珲 -##珺 -##現 -##球 -##琅 -##理 -##琇 -##琉 -##琊 -##琍 -##琏 -##琐 -##琛 -##琢 -##琥 -##琦 -##琨 -##琪 -##琬 -##琮 -##琰 -##琲 -##琳 -##琴 -##琵 -##琶 -##琺 -##琼 -##瑀 -##瑁 -##瑄 -##瑋 -##瑕 -##瑗 -##瑙 
-##瑚 -##瑛 -##瑜 -##瑞 -##瑟 -##瑠 -##瑣 -##瑤 -##瑩 -##瑪 -##瑯 -##瑰 -##瑶 -##瑾 -##璀 -##璁 -##璃 -##璇 -##璉 -##璋 -##璎 -##璐 -##璜 -##璞 -##璟 -##璧 -##璨 -##環 -##璽 -##璿 -##瓊 -##瓏 -##瓒 -##瓜 -##瓢 -##瓣 -##瓤 -##瓦 -##瓮 -##瓯 -##瓴 -##瓶 -##瓷 -##甄 -##甌 -##甕 -##甘 -##甙 -##甚 -##甜 -##生 -##產 -##産 -##甥 -##甦 -##用 -##甩 -##甫 -##甬 -##甭 -##甯 -##田 -##由 -##甲 -##申 -##电 -##男 -##甸 -##町 -##画 -##甾 -##畀 -##畅 -##界 -##畏 -##畑 -##畔 -##留 -##畜 -##畝 -##畢 -##略 -##畦 -##番 -##畫 -##異 -##畲 -##畳 -##畴 -##當 -##畸 -##畹 -##畿 -##疆 -##疇 -##疊 -##疏 -##疑 -##疔 -##疖 -##疗 -##疙 -##疚 -##疝 -##疟 -##疡 -##疣 -##疤 -##疥 -##疫 -##疮 -##疯 -##疱 -##疲 -##疳 -##疵 -##疸 -##疹 -##疼 -##疽 -##疾 -##痂 -##病 -##症 -##痈 -##痉 -##痊 -##痍 -##痒 -##痔 -##痕 -##痘 -##痙 -##痛 -##痞 -##痠 -##痢 -##痣 -##痤 -##痧 -##痨 -##痪 -##痫 -##痰 -##痱 -##痴 -##痹 -##痺 -##痼 -##痿 -##瘀 -##瘁 -##瘋 -##瘍 -##瘓 -##瘘 -##瘙 -##瘟 -##瘠 -##瘡 -##瘢 -##瘤 -##瘦 -##瘧 -##瘩 -##瘪 -##瘫 -##瘴 -##瘸 -##瘾 -##療 -##癇 -##癌 -##癒 -##癖 -##癜 -##癞 -##癡 -##癢 -##癣 -##癥 -##癫 -##癬 -##癮 -##癱 -##癲 -##癸 -##発 -##登 -##發 -##白 -##百 -##皂 -##的 -##皆 -##皇 -##皈 -##皋 -##皎 -##皑 -##皓 -##皖 -##皙 -##皚 -##皮 -##皰 -##皱 -##皴 -##皺 -##皿 -##盂 -##盃 -##盅 -##盆 -##盈 -##益 -##盎 -##盏 -##盐 -##监 -##盒 -##盔 -##盖 -##盗 -##盘 -##盛 -##盜 -##盞 -##盟 -##盡 -##監 -##盤 -##盥 -##盧 -##盪 -##目 -##盯 -##盱 -##盲 -##直 -##相 -##盹 -##盼 -##盾 -##省 -##眈 -##眉 -##看 -##県 -##眙 -##眞 -##真 -##眠 -##眦 -##眨 -##眩 -##眯 -##眶 -##眷 -##眸 -##眺 -##眼 -##眾 -##着 -##睁 -##睇 -##睏 -##睐 -##睑 -##睛 -##睜 -##睞 -##睡 -##睢 -##督 -##睥 -##睦 -##睨 -##睪 -##睫 -##睬 -##睹 -##睽 -##睾 -##睿 -##瞄 -##瞅 -##瞇 -##瞋 -##瞌 -##瞎 -##瞑 -##瞒 -##瞓 -##瞞 -##瞟 -##瞠 -##瞥 -##瞧 -##瞩 -##瞪 -##瞬 -##瞭 -##瞰 -##瞳 -##瞻 -##瞼 -##瞿 -##矇 -##矍 -##矗 -##矚 -##矛 -##矜 -##矢 -##矣 -##知 -##矩 -##矫 -##短 -##矮 -##矯 -##石 -##矶 -##矽 -##矾 -##矿 -##码 -##砂 -##砌 -##砍 -##砒 -##研 -##砖 -##砗 -##砚 -##砝 -##砣 -##砥 -##砧 -##砭 -##砰 -##砲 -##破 -##砷 -##砸 -##砺 -##砼 -##砾 -##础 -##硅 -##硐 -##硒 -##硕 -##硝 -##硫 -##硬 -##确 -##硯 -##硼 -##碁 -##碇 -##碉 -##碌 -##碍 -##碎 -##碑 -##碓 -##碗 -##碘 -##碚 -##碛 -##碟 -##碣 -##碧 -##碩 -##碰 -##碱 -##碳 -##碴 -##確 -##碼 -##碾 -##磁 -##磅 -##磊 -##磋 -##磐 -##磕 -##磚 -##磡 -##磨 -##磬 -##磯 -##磲 -##磷 -##磺 -##礁 -##礎 -##礙 
-##礡 -##礦 -##礪 -##礫 -##礴 -##示 -##礼 -##社 -##祀 -##祁 -##祂 -##祇 -##祈 -##祉 -##祎 -##祐 -##祕 -##祖 -##祗 -##祚 -##祛 -##祜 -##祝 -##神 -##祟 -##祠 -##祢 -##祥 -##票 -##祭 -##祯 -##祷 -##祸 -##祺 -##祿 -##禀 -##禁 -##禄 -##禅 -##禍 -##禎 -##福 -##禛 -##禦 -##禧 -##禪 -##禮 -##禱 -##禹 -##禺 -##离 -##禽 -##禾 -##禿 -##秀 -##私 -##秃 -##秆 -##秉 -##秋 -##种 -##科 -##秒 -##秘 -##租 -##秣 -##秤 -##秦 -##秧 -##秩 -##秭 -##积 -##称 -##秸 -##移 -##秽 -##稀 -##稅 -##程 -##稍 -##税 -##稔 -##稗 -##稚 -##稜 -##稞 -##稟 -##稠 -##稣 -##種 -##稱 -##稲 -##稳 -##稷 -##稹 -##稻 -##稼 -##稽 -##稿 -##穀 -##穂 -##穆 -##穌 -##積 -##穎 -##穗 -##穢 -##穩 -##穫 -##穴 -##究 -##穷 -##穹 -##空 -##穿 -##突 -##窃 -##窄 -##窈 -##窍 -##窑 -##窒 -##窓 -##窕 -##窖 -##窗 -##窘 -##窜 -##窝 -##窟 -##窠 -##窥 -##窦 -##窨 -##窩 -##窪 -##窮 -##窯 -##窺 -##窿 -##竄 -##竅 -##竇 -##竊 -##立 -##竖 -##站 -##竜 -##竞 -##竟 -##章 -##竣 -##童 -##竭 -##端 -##競 -##竹 -##竺 -##竽 -##竿 -##笃 -##笆 -##笈 -##笋 -##笏 -##笑 -##笔 -##笙 -##笛 -##笞 -##笠 -##符 -##笨 -##第 -##笹 -##笺 -##笼 -##筆 -##等 -##筊 -##筋 -##筍 -##筏 -##筐 -##筑 -##筒 -##答 -##策 -##筛 -##筝 -##筠 -##筱 -##筲 -##筵 -##筷 -##筹 -##签 -##简 -##箇 -##箋 -##箍 -##箏 -##箐 -##箔 -##箕 -##算 -##箝 -##管 -##箩 -##箫 -##箭 -##箱 -##箴 -##箸 -##節 -##篁 -##範 -##篆 -##篇 -##築 -##篑 -##篓 -##篙 -##篝 -##篠 -##篡 -##篤 -##篩 -##篪 -##篮 -##篱 -##篷 -##簇 -##簌 -##簍 -##簡 -##簦 -##簧 -##簪 -##簫 -##簷 -##簸 -##簽 -##簾 -##簿 -##籁 -##籃 -##籌 -##籍 -##籐 -##籟 -##籠 -##籤 -##籬 -##籮 -##籲 -##米 -##类 -##籼 -##籽 -##粄 -##粉 -##粑 -##粒 -##粕 -##粗 -##粘 -##粟 -##粤 -##粥 -##粧 -##粪 -##粮 -##粱 -##粲 -##粳 -##粵 -##粹 -##粼 -##粽 -##精 -##粿 -##糅 -##糊 -##糍 -##糕 -##糖 -##糗 -##糙 -##糜 -##糞 -##糟 -##糠 -##糧 -##糬 -##糯 -##糰 -##糸 -##系 -##糾 -##紀 -##紂 -##約 -##紅 -##紉 -##紊 -##紋 -##納 -##紐 -##紓 -##純 -##紗 -##紘 -##紙 -##級 -##紛 -##紜 -##素 -##紡 -##索 -##紧 -##紫 -##紮 -##累 -##細 -##紳 -##紹 -##紺 -##終 -##絃 -##組 -##絆 -##経 -##結 -##絕 -##絞 -##絡 -##絢 -##給 -##絨 -##絮 -##統 -##絲 -##絳 -##絵 -##絶 -##絹 -##綁 -##綏 -##綑 -##經 -##継 -##続 -##綜 -##綠 -##綢 -##綦 -##綫 -##綬 -##維 -##綱 -##網 -##綴 -##綵 -##綸 -##綺 -##綻 -##綽 -##綾 -##綿 -##緊 -##緋 -##総 -##緑 -##緒 -##緘 -##線 -##緝 -##緞 -##締 -##緣 -##編 -##緩 -##緬 -##緯 -##練 -##緹 -##緻 -##縁 -##縄 -##縈 -##縛 -##縝 -##縣 -##縫 -##縮 -##縱 -##縴 -##縷 -##總 
-##績 -##繁 -##繃 -##繆 -##繇 -##繋 -##織 -##繕 -##繚 -##繞 -##繡 -##繩 -##繪 -##繫 -##繭 -##繳 -##繹 -##繼 -##繽 -##纂 -##續 -##纍 -##纏 -##纓 -##纔 -##纖 -##纜 -##纠 -##红 -##纣 -##纤 -##约 -##级 -##纨 -##纪 -##纫 -##纬 -##纭 -##纯 -##纰 -##纱 -##纲 -##纳 -##纵 -##纶 -##纷 -##纸 -##纹 -##纺 -##纽 -##纾 -##线 -##绀 -##练 -##组 -##绅 -##细 -##织 -##终 -##绊 -##绍 -##绎 -##经 -##绑 -##绒 -##结 -##绔 -##绕 -##绘 -##给 -##绚 -##绛 -##络 -##绝 -##绞 -##统 -##绡 -##绢 -##绣 -##绥 -##绦 -##继 -##绩 -##绪 -##绫 -##续 -##绮 -##绯 -##绰 -##绳 -##维 -##绵 -##绶 -##绷 -##绸 -##绻 -##综 -##绽 -##绾 -##绿 -##缀 -##缄 -##缅 -##缆 -##缇 -##缈 -##缉 -##缎 -##缓 -##缔 -##缕 -##编 -##缘 -##缙 -##缚 -##缜 -##缝 -##缠 -##缢 -##缤 -##缥 -##缨 -##缩 -##缪 -##缭 -##缮 -##缰 -##缱 -##缴 -##缸 -##缺 -##缽 -##罂 -##罄 -##罌 -##罐 -##网 -##罔 -##罕 -##罗 -##罚 -##罡 -##罢 -##罩 -##罪 -##置 -##罰 -##署 -##罵 -##罷 -##罹 -##羁 -##羅 -##羈 -##羊 -##羌 -##美 -##羔 -##羚 -##羞 -##羟 -##羡 -##羣 -##群 -##羥 -##羧 -##羨 -##義 -##羯 -##羲 -##羸 -##羹 -##羽 -##羿 -##翁 -##翅 -##翊 -##翌 -##翎 -##習 -##翔 -##翘 -##翟 -##翠 -##翡 -##翦 -##翩 -##翰 -##翱 -##翳 -##翹 -##翻 -##翼 -##耀 -##老 -##考 -##耄 -##者 -##耆 -##耋 -##而 -##耍 -##耐 -##耒 -##耕 -##耗 -##耘 -##耙 -##耦 -##耨 -##耳 -##耶 -##耷 -##耸 -##耻 -##耽 -##耿 -##聂 -##聆 -##聊 -##聋 -##职 -##聒 -##联 -##聖 -##聘 -##聚 -##聞 -##聪 -##聯 -##聰 -##聲 -##聳 -##聴 -##聶 -##職 -##聽 -##聾 -##聿 -##肃 -##肄 -##肅 -##肆 -##肇 -##肉 -##肋 -##肌 -##肏 -##肓 -##肖 -##肘 -##肚 -##肛 -##肝 -##肠 -##股 -##肢 -##肤 -##肥 -##肩 -##肪 -##肮 -##肯 -##肱 -##育 -##肴 -##肺 -##肽 -##肾 -##肿 -##胀 -##胁 -##胃 -##胄 -##胆 -##背 -##胍 -##胎 -##胖 -##胚 -##胛 -##胜 -##胝 -##胞 -##胡 -##胤 -##胥 -##胧 -##胫 -##胭 -##胯 -##胰 -##胱 -##胳 -##胴 -##胶 -##胸 -##胺 -##能 -##脂 -##脅 -##脆 -##脇 -##脈 -##脉 -##脊 -##脍 -##脏 -##脐 -##脑 -##脓 -##脖 -##脘 -##脚 -##脛 -##脣 -##脩 -##脫 -##脯 -##脱 -##脲 -##脳 -##脸 -##脹 -##脾 -##腆 -##腈 -##腊 -##腋 -##腌 -##腎 -##腐 -##腑 -##腓 -##腔 -##腕 -##腥 -##腦 -##腩 -##腫 -##腭 -##腮 -##腰 -##腱 -##腳 -##腴 -##腸 -##腹 -##腺 -##腻 -##腼 -##腾 -##腿 -##膀 -##膈 -##膊 -##膏 -##膑 -##膘 -##膚 -##膛 -##膜 -##膝 -##膠 -##膦 -##膨 -##膩 -##膳 -##膺 -##膻 -##膽 -##膾 -##膿 -##臀 -##臂 -##臃 -##臆 -##臉 -##臊 -##臍 -##臓 -##臘 -##臟 -##臣 -##臥 -##臧 -##臨 -##自 -##臬 -##臭 -##至 -##致 -##臺 -##臻 -##臼 -##臾 -##舀 -##舂 -##舅 -##舆 
-##與 -##興 -##舉 -##舊 -##舌 -##舍 -##舎 -##舐 -##舒 -##舔 -##舖 -##舗 -##舛 -##舜 -##舞 -##舟 -##航 -##舫 -##般 -##舰 -##舱 -##舵 -##舶 -##舷 -##舸 -##船 -##舺 -##舾 -##艇 -##艋 -##艘 -##艙 -##艦 -##艮 -##良 -##艰 -##艱 -##色 -##艳 -##艷 -##艹 -##艺 -##艾 -##节 -##芃 -##芈 -##芊 -##芋 -##芍 -##芎 -##芒 -##芙 -##芜 -##芝 -##芡 -##芥 -##芦 -##芩 -##芪 -##芫 -##芬 -##芭 -##芮 -##芯 -##花 -##芳 -##芷 -##芸 -##芹 -##芻 -##芽 -##芾 -##苁 -##苄 -##苇 -##苋 -##苍 -##苏 -##苑 -##苒 -##苓 -##苔 -##苕 -##苗 -##苛 -##苜 -##苞 -##苟 -##苡 -##苣 -##若 -##苦 -##苫 -##苯 -##英 -##苷 -##苹 -##苻 -##茁 -##茂 -##范 -##茄 -##茅 -##茉 -##茎 -##茏 -##茗 -##茜 -##茧 -##茨 -##茫 -##茬 -##茭 -##茯 -##茱 -##茲 -##茴 -##茵 -##茶 -##茸 -##茹 -##茼 -##荀 -##荃 -##荆 -##草 -##荊 -##荏 -##荐 -##荒 -##荔 -##荖 -##荘 -##荚 -##荞 -##荟 -##荠 -##荡 -##荣 -##荤 -##荥 -##荧 -##荨 -##荪 -##荫 -##药 -##荳 -##荷 -##荸 -##荻 -##荼 -##荽 -##莅 -##莆 -##莉 -##莊 -##莎 -##莒 -##莓 -##莖 -##莘 -##莞 -##莠 -##莢 -##莧 -##莪 -##莫 -##莱 -##莲 -##莴 -##获 -##莹 -##莺 -##莽 -##莿 -##菀 -##菁 -##菅 -##菇 -##菈 -##菊 -##菌 -##菏 -##菓 -##菖 -##菘 -##菜 -##菟 -##菠 -##菡 -##菩 -##華 -##菱 -##菲 -##菸 -##菽 -##萁 -##萃 -##萄 -##萊 -##萋 -##萌 -##萍 -##萎 -##萘 -##萝 -##萤 -##营 -##萦 -##萧 -##萨 -##萩 -##萬 -##萱 -##萵 -##萸 -##萼 -##落 -##葆 -##葉 -##著 -##葚 -##葛 -##葡 -##董 -##葦 -##葩 -##葫 -##葬 -##葭 -##葯 -##葱 -##葳 -##葵 -##葷 -##葺 -##蒂 -##蒋 -##蒐 -##蒔 -##蒙 -##蒜 -##蒞 -##蒟 -##蒡 -##蒨 -##蒲 -##蒸 -##蒹 -##蒻 -##蒼 -##蒿 -##蓁 -##蓄 -##蓆 -##蓉 -##蓋 -##蓑 -##蓓 -##蓖 -##蓝 -##蓟 -##蓦 -##蓬 -##蓮 -##蓼 -##蓿 -##蔑 -##蔓 -##蔔 -##蔗 -##蔘 -##蔚 -##蔡 -##蔣 -##蔥 -##蔫 -##蔬 -##蔭 -##蔵 -##蔷 -##蔺 -##蔻 -##蔼 -##蔽 -##蕁 -##蕃 -##蕈 -##蕉 -##蕊 -##蕎 -##蕙 -##蕤 -##蕨 -##蕩 -##蕪 -##蕭 -##蕲 -##蕴 -##蕻 -##蕾 -##薄 -##薅 -##薇 -##薈 -##薊 -##薏 -##薑 -##薔 -##薙 -##薛 -##薦 -##薨 -##薩 -##薪 -##薬 -##薯 -##薰 -##薹 -##藉 -##藍 -##藏 -##藐 -##藓 -##藕 -##藜 -##藝 -##藤 -##藥 -##藩 -##藹 -##藻 -##藿 -##蘆 -##蘇 -##蘊 -##蘋 -##蘑 -##蘚 -##蘭 -##蘸 -##蘼 -##蘿 -##虎 -##虏 -##虐 -##虑 -##虔 -##處 -##虚 -##虛 -##虜 -##虞 -##號 -##虢 -##虧 -##虫 -##虬 -##虱 -##虹 -##虻 -##虽 -##虾 -##蚀 -##蚁 -##蚂 -##蚊 -##蚌 -##蚓 -##蚕 -##蚜 -##蚝 -##蚣 -##蚤 -##蚩 -##蚪 -##蚯 -##蚱 -##蚵 -##蛀 -##蛆 -##蛇 -##蛊 -##蛋 -##蛎 -##蛐 -##蛔 -##蛙 -##蛛 -##蛟 -##蛤 -##蛭 -##蛮 -##蛰 -##蛳 -##蛹 -##蛻 -##蛾 -##蜀 -##蜂 
-##蜃 -##蜆 -##蜇 -##蜈 -##蜊 -##蜍 -##蜒 -##蜓 -##蜕 -##蜗 -##蜘 -##蜚 -##蜜 -##蜡 -##蜢 -##蜥 -##蜱 -##蜴 -##蜷 -##蜻 -##蜿 -##蝇 -##蝈 -##蝉 -##蝌 -##蝎 -##蝕 -##蝗 -##蝙 -##蝟 -##蝠 -##蝦 -##蝨 -##蝴 -##蝶 -##蝸 -##蝼 -##螂 -##螃 -##融 -##螞 -##螢 -##螨 -##螯 -##螳 -##螺 -##蟀 -##蟄 -##蟆 -##蟋 -##蟎 -##蟑 -##蟒 -##蟠 -##蟬 -##蟲 -##蟹 -##蟻 -##蟾 -##蠅 -##蠍 -##蠔 -##蠕 -##蠛 -##蠟 -##蠡 -##蠢 -##蠣 -##蠱 -##蠶 -##蠹 -##蠻 -##血 -##衄 -##衅 -##衆 -##行 -##衍 -##術 -##衔 -##街 -##衙 -##衛 -##衝 -##衞 -##衡 -##衢 -##衣 -##补 -##表 -##衩 -##衫 -##衬 -##衮 -##衰 -##衲 -##衷 -##衹 -##衾 -##衿 -##袁 -##袂 -##袄 -##袅 -##袈 -##袋 -##袍 -##袒 -##袖 -##袜 -##袞 -##袤 -##袪 -##被 -##袭 -##袱 -##裁 -##裂 -##装 -##裆 -##裊 -##裏 -##裔 -##裕 -##裘 -##裙 -##補 -##裝 -##裟 -##裡 -##裤 -##裨 -##裱 -##裳 -##裴 -##裸 -##裹 -##製 -##裾 -##褂 -##複 -##褐 -##褒 -##褓 -##褔 -##褚 -##褥 -##褪 -##褫 -##褲 -##褶 -##褻 -##襁 -##襄 -##襟 -##襠 -##襪 -##襬 -##襯 -##襲 -##西 -##要 -##覃 -##覆 -##覇 -##見 -##規 -##覓 -##視 -##覚 -##覦 -##覧 -##親 -##覬 -##観 -##覷 -##覺 -##覽 -##觀 -##见 -##观 -##规 -##觅 -##视 -##览 -##觉 -##觊 -##觎 -##觐 -##觑 -##角 -##觞 -##解 -##觥 -##触 -##觸 -##言 -##訂 -##計 -##訊 -##討 -##訓 -##訕 -##訖 -##託 -##記 -##訛 -##訝 -##訟 -##訣 -##訥 -##訪 -##設 -##許 -##訳 -##訴 -##訶 -##診 -##註 -##証 -##詆 -##詐 -##詔 -##評 -##詛 -##詞 -##詠 -##詡 -##詢 -##詣 -##試 -##詩 -##詫 -##詬 -##詭 -##詮 -##詰 -##話 -##該 -##詳 -##詹 -##詼 -##誅 -##誇 -##誉 -##誌 -##認 -##誓 -##誕 -##誘 -##語 -##誠 -##誡 -##誣 -##誤 -##誥 -##誦 -##誨 -##說 -##説 -##読 -##誰 -##課 -##誹 -##誼 -##調 -##諄 -##談 -##請 -##諏 -##諒 -##論 -##諗 -##諜 -##諡 -##諦 -##諧 -##諫 -##諭 -##諮 -##諱 -##諳 -##諷 -##諸 -##諺 -##諾 -##謀 -##謁 -##謂 -##謄 -##謊 -##謎 -##謐 -##謔 -##謗 -##謙 -##講 -##謝 -##謠 -##謨 -##謬 -##謹 -##謾 -##譁 -##證 -##譎 -##譏 -##識 -##譙 -##譚 -##譜 -##警 -##譬 -##譯 -##議 -##譲 -##譴 -##護 -##譽 -##讀 -##變 -##讓 -##讚 -##讞 -##计 -##订 -##认 -##讥 -##讧 -##讨 -##让 -##讪 -##讫 -##训 -##议 -##讯 -##记 -##讲 -##讳 -##讴 -##讶 -##讷 -##许 -##讹 -##论 -##讼 -##讽 -##设 -##访 -##诀 -##证 -##诃 -##评 -##诅 -##识 -##诈 -##诉 -##诊 -##诋 -##词 -##诏 -##译 -##试 -##诗 -##诘 -##诙 -##诚 -##诛 -##话 -##诞 -##诟 -##诠 -##诡 -##询 -##诣 -##诤 -##该 -##详 -##诧 -##诩 -##诫 -##诬 -##语 -##误 -##诰 -##诱 -##诲 -##说 -##诵 -##诶 -##请 -##诸 -##诺 -##读 -##诽 -##课 -##诿 -##谀 -##谁 -##调 
-##谄 -##谅 -##谆 -##谈 -##谊 -##谋 -##谌 -##谍 -##谎 -##谏 -##谐 -##谑 -##谒 -##谓 -##谔 -##谕 -##谗 -##谘 -##谙 -##谚 -##谛 -##谜 -##谟 -##谢 -##谣 -##谤 -##谥 -##谦 -##谧 -##谨 -##谩 -##谪 -##谬 -##谭 -##谯 -##谱 -##谲 -##谴 -##谶 -##谷 -##豁 -##豆 -##豇 -##豈 -##豉 -##豊 -##豌 -##豎 -##豐 -##豔 -##豚 -##象 -##豢 -##豪 -##豫 -##豬 -##豹 -##豺 -##貂 -##貅 -##貌 -##貓 -##貔 -##貘 -##貝 -##貞 -##負 -##財 -##貢 -##貧 -##貨 -##販 -##貪 -##貫 -##責 -##貯 -##貰 -##貳 -##貴 -##貶 -##買 -##貸 -##費 -##貼 -##貽 -##貿 -##賀 -##賁 -##賂 -##賃 -##賄 -##資 -##賈 -##賊 -##賑 -##賓 -##賜 -##賞 -##賠 -##賡 -##賢 -##賣 -##賤 -##賦 -##質 -##賬 -##賭 -##賴 -##賺 -##購 -##賽 -##贅 -##贈 -##贊 -##贍 -##贏 -##贓 -##贖 -##贛 -##贝 -##贞 -##负 -##贡 -##财 -##责 -##贤 -##败 -##账 -##货 -##质 -##贩 -##贪 -##贫 -##贬 -##购 -##贮 -##贯 -##贰 -##贱 -##贲 -##贴 -##贵 -##贷 -##贸 -##费 -##贺 -##贻 -##贼 -##贾 -##贿 -##赁 -##赂 -##赃 -##资 -##赅 -##赈 -##赊 -##赋 -##赌 -##赎 -##赏 -##赐 -##赓 -##赔 -##赖 -##赘 -##赚 -##赛 -##赝 -##赞 -##赠 -##赡 -##赢 -##赣 -##赤 -##赦 -##赧 -##赫 -##赭 -##走 -##赳 -##赴 -##赵 -##赶 -##起 -##趁 -##超 -##越 -##趋 -##趕 -##趙 -##趟 -##趣 -##趨 -##足 -##趴 -##趵 -##趸 -##趺 -##趾 -##跃 -##跄 -##跆 -##跋 -##跌 -##跎 -##跑 -##跖 -##跚 -##跛 -##距 -##跟 -##跡 -##跤 -##跨 -##跩 -##跪 -##路 -##跳 -##践 -##跷 -##跹 -##跺 -##跻 -##踉 -##踊 -##踌 -##踏 -##踐 -##踝 -##踞 -##踟 -##踢 -##踩 -##踪 -##踮 -##踱 -##踴 -##踵 -##踹 -##蹂 -##蹄 -##蹇 -##蹈 -##蹉 -##蹊 -##蹋 -##蹑 -##蹒 -##蹙 -##蹟 -##蹣 -##蹤 -##蹦 -##蹩 -##蹬 -##蹭 -##蹲 -##蹴 -##蹶 -##蹺 -##蹼 -##蹿 -##躁 -##躇 -##躉 -##躊 -##躋 -##躍 -##躏 -##躪 -##身 -##躬 -##躯 -##躲 -##躺 -##軀 -##車 -##軋 -##軌 -##軍 -##軒 -##軟 -##転 -##軸 -##軼 -##軽 -##軾 -##較 -##載 -##輒 -##輓 -##輔 -##輕 -##輛 -##輝 -##輟 -##輩 -##輪 -##輯 -##輸 -##輻 -##輾 -##輿 -##轄 -##轅 -##轆 -##轉 -##轍 -##轎 -##轟 -##车 -##轧 -##轨 -##轩 -##转 -##轭 -##轮 -##软 -##轰 -##轲 -##轴 -##轶 -##轻 -##轼 -##载 -##轿 -##较 -##辄 -##辅 -##辆 -##辇 -##辈 -##辉 -##辊 -##辍 -##辐 -##辑 -##输 -##辕 -##辖 -##辗 -##辘 -##辙 -##辛 -##辜 -##辞 -##辟 -##辣 -##辦 -##辨 -##辩 -##辫 -##辭 -##辮 -##辯 -##辰 -##辱 -##農 -##边 -##辺 -##辻 -##込 -##辽 -##达 -##迁 -##迂 -##迄 -##迅 -##过 -##迈 -##迎 -##运 -##近 -##返 -##还 -##这 -##进 -##远 -##违 -##连 -##迟 -##迢 -##迤 -##迥 -##迦 -##迩 -##迪 -##迫 -##迭 -##述 -##迴 -##迷 -##迸 -##迹 -##迺 -##追 -##退 -##送 -##适 
-##逃 -##逅 -##逆 -##选 -##逊 -##逍 -##透 -##逐 -##递 -##途 -##逕 -##逗 -##這 -##通 -##逛 -##逝 -##逞 -##速 -##造 -##逢 -##連 -##逮 -##週 -##進 -##逵 -##逶 -##逸 -##逻 -##逼 -##逾 -##遁 -##遂 -##遅 -##遇 -##遊 -##運 -##遍 -##過 -##遏 -##遐 -##遑 -##遒 -##道 -##達 -##違 -##遗 -##遙 -##遛 -##遜 -##遞 -##遠 -##遢 -##遣 -##遥 -##遨 -##適 -##遭 -##遮 -##遲 -##遴 -##遵 -##遶 -##遷 -##選 -##遺 -##遼 -##遽 -##避 -##邀 -##邁 -##邂 -##邃 -##還 -##邇 -##邈 -##邊 -##邋 -##邏 -##邑 -##邓 -##邕 -##邛 -##邝 -##邢 -##那 -##邦 -##邨 -##邪 -##邬 -##邮 -##邯 -##邰 -##邱 -##邳 -##邵 -##邸 -##邹 -##邺 -##邻 -##郁 -##郅 -##郊 -##郎 -##郑 -##郜 -##郝 -##郡 -##郢 -##郤 -##郦 -##郧 -##部 -##郫 -##郭 -##郴 -##郵 -##郷 -##郸 -##都 -##鄂 -##鄉 -##鄒 -##鄔 -##鄙 -##鄞 -##鄢 -##鄧 -##鄭 -##鄰 -##鄱 -##鄲 -##鄺 -##酉 -##酊 -##酋 -##酌 -##配 -##酐 -##酒 -##酗 -##酚 -##酝 -##酢 -##酣 -##酥 -##酩 -##酪 -##酬 -##酮 -##酯 -##酰 -##酱 -##酵 -##酶 -##酷 -##酸 -##酿 -##醃 -##醇 -##醉 -##醋 -##醍 -##醐 -##醒 -##醚 -##醛 -##醜 -##醞 -##醣 -##醪 -##醫 -##醬 -##醮 -##醯 -##醴 -##醺 -##釀 -##釁 -##采 -##釉 -##释 -##釋 -##里 -##重 -##野 -##量 -##釐 -##金 -##釗 -##釘 -##釜 -##針 -##釣 -##釦 -##釧 -##釵 -##鈀 -##鈉 -##鈍 -##鈎 -##鈔 -##鈕 -##鈞 -##鈣 -##鈦 -##鈪 -##鈴 -##鈺 -##鈾 -##鉀 -##鉄 -##鉅 -##鉉 -##鉑 -##鉗 -##鉚 -##鉛 -##鉤 -##鉴 -##鉻 -##銀 -##銃 -##銅 -##銑 -##銓 -##銖 -##銘 -##銜 -##銬 -##銭 -##銮 -##銳 -##銷 -##銹 -##鋁 -##鋅 -##鋒 -##鋤 -##鋪 -##鋰 -##鋸 -##鋼 -##錄 -##錐 -##錘 -##錚 -##錠 -##錢 -##錦 -##錨 -##錫 -##錮 -##錯 -##録 -##錳 -##錶 -##鍊 -##鍋 -##鍍 -##鍛 -##鍥 -##鍰 -##鍵 -##鍺 -##鍾 -##鎂 -##鎊 -##鎌 -##鎏 -##鎔 -##鎖 -##鎗 -##鎚 -##鎧 -##鎬 -##鎮 -##鎳 -##鏈 -##鏖 -##鏗 -##鏘 -##鏞 -##鏟 -##鏡 -##鏢 -##鏤 -##鏽 -##鐘 -##鐮 -##鐲 -##鐳 -##鐵 -##鐸 -##鐺 -##鑄 -##鑊 -##鑑 -##鑒 -##鑣 -##鑫 -##鑰 -##鑲 -##鑼 -##鑽 -##鑾 -##鑿 -##针 -##钉 -##钊 -##钎 -##钏 -##钒 -##钓 -##钗 -##钙 -##钛 -##钜 -##钝 -##钞 -##钟 -##钠 -##钡 -##钢 -##钣 -##钤 -##钥 -##钦 -##钧 -##钨 -##钩 -##钮 -##钯 -##钰 -##钱 -##钳 -##钴 -##钵 -##钺 -##钻 -##钼 -##钾 -##钿 -##铀 -##铁 -##铂 -##铃 -##铄 -##铅 -##铆 -##铉 -##铎 -##铐 -##铛 -##铜 -##铝 -##铠 -##铡 -##铢 -##铣 -##铤 -##铨 -##铩 -##铬 -##铭 -##铮 -##铰 -##铲 -##铵 -##银 -##铸 -##铺 -##链 -##铿 -##销 -##锁 -##锂 -##锄 -##锅 -##锆 -##锈 -##锉 -##锋 -##锌 -##锏 -##锐 -##锑 -##错 -##锚 -##锟 -##锡 -##锢 -##锣 -##锤 -##锥 -##锦 -##锭 -##键 -##锯 -##锰 -##锲 
-##锵 -##锹 -##锺 -##锻 -##镀 -##镁 -##镂 -##镇 -##镉 -##镌 -##镍 -##镐 -##镑 -##镕 -##镖 -##镗 -##镛 -##镜 -##镣 -##镭 -##镯 -##镰 -##镳 -##镶 -##長 -##长 -##門 -##閃 -##閉 -##開 -##閎 -##閏 -##閑 -##閒 -##間 -##閔 -##閘 -##閡 -##関 -##閣 -##閥 -##閨 -##閩 -##閱 -##閲 -##閹 -##閻 -##閾 -##闆 -##闇 -##闊 -##闌 -##闍 -##闔 -##闕 -##闖 -##闘 -##關 -##闡 -##闢 -##门 -##闪 -##闫 -##闭 -##问 -##闯 -##闰 -##闲 -##间 -##闵 -##闷 -##闸 -##闹 -##闺 -##闻 -##闽 -##闾 -##阀 -##阁 -##阂 -##阅 -##阆 -##阇 -##阈 -##阉 -##阎 -##阐 -##阑 -##阔 -##阕 -##阖 -##阙 -##阚 -##阜 -##队 -##阡 -##阪 -##阮 -##阱 -##防 -##阳 -##阴 -##阵 -##阶 -##阻 -##阿 -##陀 -##陂 -##附 -##际 -##陆 -##陇 -##陈 -##陋 -##陌 -##降 -##限 -##陕 -##陛 -##陝 -##陞 -##陟 -##陡 -##院 -##陣 -##除 -##陨 -##险 -##陪 -##陰 -##陲 -##陳 -##陵 -##陶 -##陷 -##陸 -##険 -##陽 -##隅 -##隆 -##隈 -##隊 -##隋 -##隍 -##階 -##随 -##隐 -##隔 -##隕 -##隘 -##隙 -##際 -##障 -##隠 -##隣 -##隧 -##隨 -##險 -##隱 -##隴 -##隶 -##隸 -##隻 -##隼 -##隽 -##难 -##雀 -##雁 -##雄 -##雅 -##集 -##雇 -##雉 -##雋 -##雌 -##雍 -##雎 -##雏 -##雑 -##雒 -##雕 -##雖 -##雙 -##雛 -##雜 -##雞 -##離 -##難 -##雨 -##雪 -##雯 -##雰 -##雲 -##雳 -##零 -##雷 -##雹 -##電 -##雾 -##需 -##霁 -##霄 -##霆 -##震 -##霈 -##霉 -##霊 -##霍 -##霎 -##霏 -##霑 -##霓 -##霖 -##霜 -##霞 -##霧 -##霭 -##霰 -##露 -##霸 -##霹 -##霽 -##霾 -##靂 -##靄 -##靈 -##青 -##靓 -##靖 -##静 -##靚 -##靛 -##靜 -##非 -##靠 -##靡 -##面 -##靥 -##靦 -##革 -##靳 -##靴 -##靶 -##靼 -##鞅 -##鞋 -##鞍 -##鞏 -##鞑 -##鞘 -##鞠 -##鞣 -##鞦 -##鞭 -##韆 -##韋 -##韌 -##韓 -##韜 -##韦 -##韧 -##韩 -##韬 -##韭 -##音 -##韵 -##韶 -##韻 -##響 -##頁 -##頂 -##頃 -##項 -##順 -##須 -##頌 -##預 -##頑 -##頒 -##頓 -##頗 -##領 -##頜 -##頡 -##頤 -##頫 -##頭 -##頰 -##頷 -##頸 -##頹 -##頻 -##頼 -##顆 -##題 -##額 -##顎 -##顏 -##顔 -##願 -##顛 -##類 -##顧 -##顫 -##顯 -##顱 -##顴 -##页 -##顶 -##顷 -##项 -##顺 -##须 -##顼 -##顽 -##顾 -##顿 -##颁 -##颂 -##预 -##颅 -##领 -##颇 -##颈 -##颉 -##颊 -##颌 -##颍 -##颐 -##频 -##颓 -##颔 -##颖 -##颗 -##题 -##颚 -##颛 -##颜 -##额 -##颞 -##颠 -##颡 -##颢 -##颤 -##颦 -##颧 -##風 -##颯 -##颱 -##颳 -##颶 -##颼 -##飄 -##飆 -##风 -##飒 -##飓 -##飕 -##飘 -##飙 -##飚 -##飛 -##飞 -##食 -##飢 -##飨 -##飩 -##飪 -##飯 -##飲 -##飼 -##飽 -##飾 -##餃 -##餅 -##餉 -##養 -##餌 -##餐 -##餒 -##餓 -##餘 -##餚 -##餛 -##餞 -##餡 -##館 -##餮 -##餵 -##餾 -##饅 -##饈 -##饋 -##饌 -##饍 -##饑 -##饒 -##饕 -##饗 -##饞 
-##饥 -##饨 -##饪 -##饬 -##饭 -##饮 -##饯 -##饰 -##饱 -##饲 -##饴 -##饵 -##饶 -##饷 -##饺 -##饼 -##饽 -##饿 -##馀 -##馁 -##馄 -##馅 -##馆 -##馈 -##馋 -##馍 -##馏 -##馒 -##馔 -##首 -##馗 -##香 -##馥 -##馨 -##馬 -##馭 -##馮 -##馳 -##馴 -##駁 -##駄 -##駅 -##駆 -##駐 -##駒 -##駕 -##駛 -##駝 -##駭 -##駱 -##駿 -##騁 -##騎 -##騏 -##験 -##騙 -##騨 -##騰 -##騷 -##驀 -##驅 -##驊 -##驍 -##驒 -##驕 -##驗 -##驚 -##驛 -##驟 -##驢 -##驥 -##马 -##驭 -##驮 -##驯 -##驰 -##驱 -##驳 -##驴 -##驶 -##驷 -##驸 -##驹 -##驻 -##驼 -##驾 -##驿 -##骁 -##骂 -##骄 -##骅 -##骆 -##骇 -##骈 -##骊 -##骋 -##验 -##骏 -##骐 -##骑 -##骗 -##骚 -##骛 -##骜 -##骞 -##骠 -##骡 -##骤 -##骥 -##骧 -##骨 -##骯 -##骰 -##骶 -##骷 -##骸 -##骼 -##髂 -##髅 -##髋 -##髏 -##髒 -##髓 -##體 -##髖 -##高 -##髦 -##髪 -##髮 -##髯 -##髻 -##鬃 -##鬆 -##鬍 -##鬓 -##鬚 -##鬟 -##鬢 -##鬣 -##鬥 -##鬧 -##鬱 -##鬼 -##魁 -##魂 -##魄 -##魅 -##魇 -##魍 -##魏 -##魔 -##魘 -##魚 -##魯 -##魷 -##鮑 -##鮨 -##鮪 -##鮭 -##鮮 -##鯉 -##鯊 -##鯖 -##鯛 -##鯨 -##鯰 -##鯽 -##鰍 -##鰓 -##鰭 -##鰲 -##鰻 -##鰾 -##鱈 -##鱉 -##鱔 -##鱗 -##鱷 -##鱸 -##鱼 -##鱿 -##鲁 -##鲈 -##鲍 -##鲑 -##鲛 -##鲜 -##鲟 -##鲢 -##鲤 -##鲨 -##鲫 -##鲱 -##鲲 -##鲶 -##鲷 -##鲸 -##鳃 -##鳄 -##鳅 -##鳌 -##鳍 -##鳕 -##鳖 -##鳗 -##鳝 -##鳞 -##鳥 -##鳩 -##鳳 -##鳴 -##鳶 -##鴉 -##鴕 -##鴛 -##鴦 -##鴨 -##鴻 -##鴿 -##鵑 -##鵜 -##鵝 -##鵡 -##鵬 -##鵰 -##鵲 -##鶘 -##鶩 -##鶯 -##鶴 -##鷗 -##鷲 -##鷹 -##鷺 -##鸚 -##鸞 -##鸟 -##鸠 -##鸡 -##鸢 -##鸣 -##鸥 -##鸦 -##鸨 -##鸪 -##鸭 -##鸯 -##鸳 -##鸵 -##鸽 -##鸾 -##鸿 -##鹂 -##鹃 -##鹄 -##鹅 -##鹈 -##鹉 -##鹊 -##鹌 -##鹏 -##鹑 -##鹕 -##鹘 -##鹜 -##鹞 -##鹤 -##鹦 -##鹧 -##鹫 -##鹭 -##鹰 -##鹳 -##鹵 -##鹹 -##鹼 -##鹽 -##鹿 -##麂 -##麋 -##麒 -##麓 -##麗 -##麝 -##麟 -##麥 -##麦 -##麩 -##麴 -##麵 -##麸 -##麺 -##麻 -##麼 -##麽 -##麾 -##黃 -##黄 -##黍 -##黎 -##黏 -##黑 -##黒 -##黔 -##默 -##黛 -##黜 -##黝 -##點 -##黠 -##黨 -##黯 -##黴 -##鼋 -##鼎 -##鼐 -##鼓 -##鼠 -##鼬 -##鼹 -##鼻 -##鼾 -##齁 -##齊 -##齋 -##齐 -##齒 -##齡 -##齢 -##齣 -##齦 -##齿 -##龄 -##龅 -##龈 -##龊 -##龋 -##龌 -##龍 -##龐 -##龔 -##龕 -##龙 -##龚 -##龛 -##龜 -##龟 -##︰ -##︱ -##︶ -##︿ -##﹁ -##﹂ -##﹍ -##﹏ -##﹐ -##﹑ -##﹒ -##﹔ -##﹕ -##﹖ -##﹗ -##﹙ -##﹚ -##﹝ -##﹞ -##﹡ -##﹣ -##! -##" -### -##$ -##% -##& -##' -##( -##) -##* -##, -##- -##. -##/ -##: -##; -##< -##? 
-##@ -##[ -##\ -##] -##^ -##_ -##` -##f -##h -##j -##u -##w -##z -##{ -##} -##。 -##「 -##」 -##、 -##・ -##ッ -##ー -##イ -##ク -##シ -##ス -##ト -##ノ -##フ -##ラ -##ル -##ン -##゙ -##゚ -## ̄ -##¥ -##👍 -##🔥 -##😂 -##😎 -[unused0] -[unused100] -[unused101] -[unused102] -[unused103] -[unused104] -[unused105] -[unused106] -[unused107] -[unused108] -[unused109] -[unused110] -[unused111] -[unused112] -[unused113] -[unused114] -[unused115] -[unused116] -[unused117] -[unused118] -[unused119] -[unused120] -[unused121] -[unused122] -[unused123] -[unused124] -[unused125] -[unused126] -[unused127] -[unused128] -[unused129] -[unused130] -[unused131] -[unused132] -[unused133] -[unused134] -[unused135] -[unused136] -[unused137] -[unused138] -[unused139] -[unused140] -[unused141] -[unused142] -[unused143] -[unused144] -[unused145] -[unused146] -[unused147] -[unused148] -[unused149] -[unused150] -[unused151] -[unused152] -[unused153] -[unused154] -[unused155] -[unused156] -[unused157] -[unused158] -[unused159] -[unused160] -[unused161] -[unused162] -[unused163] -[unused164] -[unused165] -[unused166] -[unused167] -[unused168] -[unused169] -[unused170] -[unused171] -[unused172] -[unused173] -[unused174] -[unused175] -[unused176] -[unused177] -[unused178] -[unused179] -[unused180] -[unused181] -[unused182] -[unused183] -[unused184] -[unused185] -[unused186] -[unused187] -[unused188] -[unused189] -[unused190] -[unused191] -[unused192] -[unused193] -[unused194] -[unused195] -[unused196] -[unused197] -[unused198] -[unused199] -[unused200] -[unused201] -[unused202] -[unused203] -[unused204] -[unused205] -[unused206] -[unused207] -[unused208] -[unused209] -[unused210] -[unused211] -[unused212] -[unused213] -[unused214] -[unused215] -[unused216] -[unused217] -[unused218] -[unused219] -[unused220] -[unused221] -[unused222] -[unused223] -[unused224] -[unused225] -[unused226] -[unused227] -[unused228] -[unused229] -[unused230] -[unused231] -[unused232] -[unused233] -[unused234] -[unused235] -[unused236] 
-[unused237] -[unused238] -[unused239] -[unused240] -[unused241] -[unused242] -[unused243] -[unused244] -[unused245] -[unused246] -[unused247] -[unused248] -[unused249] -[unused250] -[unused251] -[unused252] -[unused253] -[unused254] -[unused255] -[unused256] -[unused257] -[unused258] -[unused259] -[unused260] -[unused261] -[unused262] -[unused263] -[unused264] -[unused265] -[unused266] -[unused267] -[unused268] -[unused269] -[unused270] -[unused271] -[unused272] -[unused273] -[unused274] -[unused275] -[unused276] -[unused277] -[unused278] -[unused279] -[unused280] -[unused281] -[unused282] -[unused283] -[unused284] -[unused285] -[unused286] -[unused287] -[unused288] -[unused289] -[unused290] -[unused291] -[unused292] -[unused293] -[unused294] -[unused295] -[unused296] -[unused297] -[unused298] -[unused299] -[unused300] -[unused301] -[unused302] -[unused303] -[unused304] -[unused305] -[unused306] -[unused307] -[unused308] -[unused309] -[unused310] -[unused311] -[unused312] -[unused313] -[unused314] -[unused315] -[unused316] -[unused317] -[unused318] -[unused319] -[unused320] -[unused321] -[unused322] -[unused323] -[unused324] -[unused325] -[unused326] -[unused327] -[unused328] -[unused329] -[unused330] -[unused331] -[unused332] -[unused333] -[unused334] -[unused335] -[unused336] -[unused337] -[unused338] -[unused339] -[unused340] -[unused341] -[unused342] -[unused343] -[unused344] -[unused345] -[unused346] -[unused347] -[unused348] -[unused349] -[unused350] -[unused351] -[unused352] -[unused353] -[unused354] -[unused355] -[unused356] -[unused357] -[unused358] -[unused359] -[unused360] -[unused361] -[unused362] -[unused363] -[unused364] -[unused365] -[unused366] -[unused367] -[unused368] -[unused369] -[unused370] -[unused371] -[unused372] -[unused373] -[unused374] -[unused375] -[unused376] -[unused377] -[unused378] -[unused379] -[unused380] -[unused381] -[unused382] -[unused383] -[unused384] -[unused385] -[unused386] -[unused387] -[unused388] -[unused389] 
-[unused390] -[unused391] -[unused392] -[unused393] -[unused394] -[unused395] -[unused396] -[unused397] -[unused398] -[unused399] -[unused400] -[unused401] -[unused402] -[unused403] -[unused404] -[unused405] -[unused406] -[unused407] -[unused408] -[unused409] -[unused410] -[unused411] -[unused412] -[unused413] -[unused414] -[unused415] -[unused416] -[unused417] -[unused418] -[unused419] -[unused420] -[unused421] -[unused422] -[unused423] -[unused424] -[unused425] -[unused426] -[unused427] -[unused428] -[unused429] -[unused430] -[unused431] -[unused432] -[unused433] -[unused434] -[unused435] -[unused436] -[unused437] -[unused438] -[unused439] -[unused440] -[unused441] -[unused442] -[unused443] -[unused444] -[unused445] -[unused446] -[unused447] -[unused448] -[unused449] -[unused450] -[unused451] -[unused452] -[unused453] -[unused454] -[unused455] -[unused456] -[unused457] -[unused458] -[unused459] -[unused460] -[unused461] -[unused462] -[unused463] -[unused464] -[unused465] -[unused466] -[unused467] -[unused468] -[unused469] -[unused470] -[unused471] -[unused472] -[unused473] -[unused474] -[unused475] -[unused476] -[unused477] -[unused478] -[unused479] -[unused480] -[unused481] -[unused482] -[unused483] -[unused484] -[unused485] -[unused486] -[unused487] -[unused488] -[unused489] -[unused490] -[unused491] -[unused492] -[unused493] -[unused494] -[unused495] -[unused496] -[unused497] -[unused498] -[unused499] -[unused500] -[unused501] -[unused502] -[unused503] -[unused504] -[unused505] -[unused506] -[unused507] -[unused508] -[unused509] -[unused510] -[unused511] -[unused512] -[unused513] -[unused514] -[unused515] -[unused516] -[unused517] -[unused518] -[unused519] -[unused520] -[unused521] -[unused522] -[unused523] -[unused524] -[unused525] -[unused526] -[unused527] -[unused528] -[unused529] -[unused530] -[unused531] -[unused532] -[unused533] -[unused534] -[unused535] -[unused536] -[unused537] -[unused538] -[unused539] -[unused540] -[unused541] -[unused542] 
-[unused543] -[unused544] -[unused545] -[unused546] -[unused547] -[unused548] -[unused549] -[unused550] -[unused551] -[unused552] -[unused553] -[unused554] -[unused555] -[unused556] -[unused557] -[unused558] -[unused559] -[unused560] -[unused561] -[unused562] -[unused563] -[unused564] -[unused565] -[unused566] -[unused567] -[unused568] -[unused569] -[unused570] -[unused571] -[unused572] -[unused573] -[unused574] -[unused575] -[unused576] -[unused577] -[unused578] -[unused579] -[unused580] -[unused581] -[unused582] -[unused583] -[unused584] -[unused585] -[unused586] -[unused587] -[unused588] -[unused589] -[unused590] -[unused591] -[unused592] -[unused593] -[unused594] -[unused595] -[unused596] -[unused597] -[unused598] -[unused599] -[unused600] -[unused601] -[unused602] -[unused603] -[unused604] -[unused605] -[unused606] -[unused607] -[unused608] -[unused609] -[unused610] -[unused611] -[unused612] -[unused613] -[unused614] -[unused615] -[unused616] -[unused617] -[unused618] -[unused619] -[unused620] -[unused621] -[unused622] -[unused623] -[unused624] -[unused625] -[unused626] -[unused627] -[unused628] -[unused629] -[unused630] -[unused631] -[unused632] -[unused633] -[unused634] -[unused635] -[unused636] -[unused637] -[unused638] -[unused639] -[unused640] -[unused641] -[unused642] -[unused643] -[unused644] -[unused645] -[unused646] -[unused647] -[unused648] -[unused649] -[unused650] -[unused651] -[unused652] -[unused653] -[unused654] -[unused655] -[unused656] -[unused657] -[unused658] -[unused659] -[unused660] -[unused661] -[unused662] -[unused663] -[unused664] -[unused665] -[unused666] -[unused667] -[unused668] -[unused669] -[unused670] -[unused671] -[unused672] -[unused673] -[unused674] -[unused675] -[unused676] -[unused677] -[unused678] -[unused679] -[unused680] -[unused681] -[unused682] -[unused683] -[unused684] -[unused685] -[unused686] -[unused687] -[unused688] -[unused689] -[unused690] -[unused691] -[unused692] -[unused693] -[unused694] -[unused695] 
-[unused696] -[unused697] -[unused698] -[unused699] -[unused700] -[unused701] -[unused702] -[unused703] -[unused704] -[unused705] -[unused706] -[unused707] -[unused708] -[unused709] -[unused710] -[unused711] -[unused712] -[unused713] -[unused714] -[unused715] -[unused716] -[unused717] -[unused718] -[unused719] -[unused720] -[unused721] -[unused722] -[unused723] -[unused724] -[unused725] -[unused726] -[unused727] -[unused728] -[unused729] -[unused730] -[unused731] -[unused732] -[unused733] -[unused734] -[unused735] -[unused736] -[unused737] -[unused738] -[unused739] -[unused740] -[unused741] -[unused742] -[unused743] -[unused744] -[unused745] -[unused746] -[unused747] -[unused748] -[unused749] -[unused750] -[unused751] -[unused752] -[unused753] -[unused754] -[unused755] -[unused756] -[unused757] -[unused758] -[unused759] -[unused760] -[unused761] -[unused762] -[unused763] -[unused764] -[unused765] -[unused766] -[unused767] -[unused768] -[unused769] -[unused770] -[unused771] -[unused772] -[unused773] -[unused774] -[unused775] -[unused776] -[unused777] -[unused778] -[unused779] -[unused780] -[unused781] -[unused782] -[unused783] -[unused784] -[unused785] -[unused786] -[unused787] -[unused788] -[unused789] -[unused790] -[unused791] -[unused792] -[unused793] -[unused794] -[unused795] -[unused796] -[unused797] -[unused798] -[unused799] -[unused800] -[unused801] -[unused802] -[unused803] -[unused804] -[unused805] -[unused806] -[unused807] -[unused808] -[unused809] -[unused810] -[unused811] -[unused812] -[unused813] -[unused814] -[unused815] -[unused816] -[unused817] -[unused818] -[unused819] -[unused820] -[unused821] -[unused822] -[unused823] -[unused824] -[unused825] -[unused826] -[unused827] -[unused828] -[unused829] -[unused830] -[unused831] -[unused832] -[unused833] -[unused834] -[unused835] -[unused836] -[unused837] -[unused838] -[unused839] -[unused840] -[unused841] -[unused842] -[unused843] -[unused844] -[unused845] -[unused846] -[unused847] -[unused848] 
-[unused849] -[unused850] -[unused851] -[unused852] -[unused853] -[unused854] -[unused855] -[unused856] -[unused857] -[unused858] -[unused859] -[unused860] -[unused861] -[unused862] -[unused863] -[unused864] -[unused865] -[unused866] -[unused867] -[unused868] -[unused869] -[unused870] -[unused871] -[unused872] -[unused873] -[unused874] -[unused875] -[unused876] -[unused877] -[unused878] -[unused879] -[unused880] -[unused881] -[unused882] -[unused883] -[unused884] -[unused885] -[unused886] -[unused887] -[unused888] -[unused889] -[unused890] -[unused891] -[unused892] -[unused893] -[unused894] -[unused895] -[unused896] -[unused897] -[unused898] -[unused899] -[unused900] -[unused901] -[unused902] -[unused903] -[unused904] -[unused905] -[unused906] -[unused907] -[unused908] -[unused909] -[unused910] -[unused911] -[unused912] -[unused913] -[unused914] -[unused915] -[unused916] -[unused917] -[unused918] -[unused919] -[unused920] -[unused921] -[unused922] -[unused923] -[unused924] -[unused925] -[unused926] -[unused927] -[unused928] -[unused929] -[unused930] -[unused931] -[unused932] -[unused933] -[unused934] -[unused935] -[unused936] -[unused937] -[unused938] -[unused939] -[unused940] -[unused941] -[unused942] -[unused943] -[unused944] -[unused945] -[unused946] -[unused947] -[unused948] -[unused949] -[unused950] -[unused951] -[unused952] -[unused953] -[unused954] -[unused955] -[unused956] -[unused957] -[unused958] -[unused959] -[unused960] -[unused961] -[unused962] -[unused963] -[unused964] -[unused965] -[unused966] -[unused967] -[unused968] -[unused969] -[unused970] -[unused971] -[unused972] -[unused973] -[unused974] -[unused975] -[unused976] -[unused977] -[unused978] -[unused979] -[unused980] -[unused981] -[unused982] -[unused983] -[unused984] -[unused985] -[unused986] -[unused987] -[unused988] -[unused989] -[unused990] -[unused991] -[unused992] -[unused993] -` -¡ -¢ -¦ -¨ -ª -¬ -´ -¶ -½ -¾ -¿ -ð -þ -ħ -ı -ł -œ -ƒ -ɐ -ɑ -ɒ -ɕ -ɛ -ɣ -ɨ -ɪ -ɫ -ɬ -ɯ -ɲ -ɴ -ɹ -ɾ -ʀ -ʁ -ʂ -ʃ 
-ʉ -ʊ -ʋ -ʌ -ʎ -ʐ -ʑ -ʒ -ʔ -ʲ -ʳ -ʷ -ʸ -ʻ -ʼ -ʾ -ʿ -ˡ -ˣ -ˤ -ζ -ξ -щ -ъ -э -ю -ђ -є -ј -љ -њ -ћ -ӏ -ա -բ -գ -դ -ե -թ -ի -լ -կ -հ -մ -յ -ն -ո -պ -ս -վ -տ -ր -ւ -ք -־ -א -ב -ג -ד -ה -ו -ז -ח -ט -י -ך -כ -ל -ם -מ -ן -נ -ס -ע -ף -פ -ץ -צ -ק -ר -ש -ת -، -ء -ث -ج -ح -خ -ذ -ز -ش -ص -ض -ط -ظ -غ -ـ -ف -ق -ك -ى -ٹ -پ -چ -ک -گ -ں -ھ -ہ -ی -ے -अ -आ -उ -ए -क -ख -ग -च -ज -ट -ड -ण -त -थ -द -ध -न -प -ब -भ -म -य -र -ल -व -श -ष -स -ह -ा -ि -ी -ो -। -॥ -ং -অ -আ -ই -উ -এ -ও -ক -খ -গ -চ -ছ -জ -ট -ড -ণ -ত -থ -দ -ধ -ন -প -ব -ভ -ম -য -র -ল -শ -ষ -স -হ -া -ি -ী -ে -க -ச -ட -த -ந -ன -ப -ம -ய -ர -ல -ள -வ -ா -ி -ு -ே -ை -ನ -ರ -ಾ -ක -ය -ර -ල -ව -ා -ต -ท -พ -ล -ว -ส -། -ག -ང -ད -ན -པ -བ -མ -འ -ར -ལ -ས -မ -ა -ბ -გ -დ -ე -ვ -თ -ი -კ -ლ -მ -ნ -ო -რ -ს -ტ -უ -ᄊ -ᴬ -ᴮ -ᴰ -ᴵ -ᴺ -ᵀ -ᵇ -ᵈ -ᵖ -ᵗ -ᵢ -ᵣ -ᵤ -ᵥ -ᶜ -ᶠ -‐ -‑ -‒ -– -— -― -‘ -’ -‚ -“ -” -‡ -… -⁰ -⁴ -⁵ -⁶ -⁷ -⁸ -⁹ -⁻ -₀ -₅ -₆ -₇ -₈ -₉ -₊ -₍ -₎ -ₐ -ₑ -ₒ -ₓ -ₕ -ₖ -ₗ -ₘ -ₙ -ₚ -ₛ -ₜ -₤ -₩ -₱ -₹ -ℓ -ℝ -⅓ -⅔ -↦ -⇄ -⇌ -∂ -∅ -∆ -∇ -∈ -∗ -∘ -∧ -∨ -∪ -⊂ -⊆ -⊕ -⊗ -☉ -♭ -♯ -⟨ -⟩ -ⱼ -⺩ -⺼ -⽥ -亻 -宀 -彳 -忄 -扌 -氵 -疒 -糹 -訁 -辶 -阝 -龸 -fi -fl -had -were -which -him -their -been -would -then -them -could -during -through -between -while -later -around -did -such -being -used -against -many -both -these -known -until -even -didn -because -born -since -still -became -any -including -took -same -each -called -much -however -four -another -found -won -going -away -hand -several -following -released -played -began -district -those -held -own -early -league -government -came -based -thought -looked -along -went -few -father -former -located -got -though -every -century -without -within -building -large -named -started -once -should -built -british -death -moved -door -need -president -wasn -although -due -major -died -third -knew -asked -turned -wanted -together -received -son -served -different -behind -himself -felt -members -football -near -having -saw -mother -army -front -late -hands -put -division -across -told -often -ever -french -six -include -tell -among -species 
-really -according -half -original -gave -making -enough -opened -must -included -given -german -woman -community -might -million -court -short -round -seen -always -become -sure -almost -director -council -career -things -using -couldn -better -students -married -nothing -worked -others -record -anything -continued -give -military -established -returned -does -written -thing -feet -far -already -championship -western -department -role -various -production -television -produced -working -region -present -period -looking -least -total -england -wife -per -brother -soon -political -taken -created -further -able -reached -joined -upon -done -important -either -appeared -position -ground -lead -election -arms -police -instead -words -moment -someone -announced -less -wrote -past -followed -founded -finally -india -taking -records -considered -northern -toward -european -outside -described -track -playing -heard -professional -australia -miles -yet -trying -blood -southern -maybe -everything -mouth -race -recorded -above -daughter -points -middle -move -tried -elected -closed -ten -minister -chief -person -similar -brought -rest -formed -floor -doing -killed -training -needed -turn -finished -railway -rather -sent -example -ran -term -coming -currently -forces -despite -areas -fact -dead -originally -germany -probably -developed -pulled -stood -signed -songs -child -eventually -met -average -teams -minutes -current -kind -decided -usually -eastern -seemed -episode -bed -added -indian -route -available -throughout -addition -appointed -eight -construction -mean -remained -schools -sometimes -events -possible -australian -forward -debut -seat -performance -committee -features -character -herself -lot -russian -range -hours -sold -quickly -directed -guitar -performed -players -smile -myself -placed -province -towards -wouldn -leading -whole -designed -census -europe -attack -japanese -getting -alone -lower -wide -hospital -believe -changed -sister -gone -hadn -ship 
-studies -academy -shot -below -involved -kept -largest -especially -beginning -movement -section -female -professor -lord -longer -walked -actually -civil -families -thus -aircraft -completed -includes -captain -fight -vocals -featured -fourth -officer -hear -means -medical -groups -lips -competition -entire -lived -leaving -federal -tournament -passed -independent -kingdom -spent -fine -doesn -reported -fall -raised -itself -replaced -leader -theatre -whose -parents -spanish -canadian -degree -writing -awarded -higher -coast -provided -senior -organization -stopped -onto -countries -parts -conference -interest -saying -allowed -earlier -matter -winning -try -happened -moving -los -breath -nearly -mid -certain -italian -african -standing -fell -artist -shows -deal -mine -industry -everyone -republic -provide -student -primary -owned -older -heavy -1st -makes -attention -anyone -africa -stated -length -ended -fingers -command -staff -foreign -opening -governor -okay -medal -kill -introduced -chest -hell -feeling -success -meet -reason -meeting -novel -trade -buildings -guy -goal -native -husband -previously -entered -producer -operations -takes -covered -forced -roman -complete -successful -texas -cold -traditional -films -clear -approximately -nine -prince -question -tracks -ireland -regional -personal -operation -economic -holding -twenty -additional -hour -regular -historic -places -whom -shook -km² -secretary -prior -scored -units -ask -property -ready -immediately -month -listed -contract -themselves -lines -navy -writer -meant -runs -practice -championships -singer -commission -required -starting -generally -giving -attended -couple -stand -catholic -caught -executive -thinking -chair -quite -shoulder -hope -decision -plays -defeated -municipality -whether -offered -slowly -pain -direction -mission -mostly -noted -individual -managed -lives -plant -helped -except -studied -computer -figure -relationship -issue -significant -loss -smiled -gun -highest -male 
-bring -goals -mexico -problem -distance -commercial -completely -location -annual -famous -neck -caused -italy -understand -greek -highway -wrong -comes -appearance -issues -musical -companies -castle -income -assembly -bass -initially -parliament -artists -experience -particular -walk -foot -engineering -talking -dropped -boys -stars -remember -carried -train -stadium -angeles -evidence -becoming -assistant -soviet -upper -youth -reach -actor -numerous -nodded -arrived -minute -believed -complex -victory -associated -temple -chance -perhaps -bishop -launched -particularly -retired -subject -prize -contains -yeah -theory -empire -suddenly -waiting -trust -recording -terms -champion -religious -zealand -names -2nd -ancient -corner -represented -legal -justice -cause -watched -brothers -material -changes -simply -response -answer -historical -stories -straight -feature -increased -administration -virginia -activities -cultural -overall -winner -programs -basketball -legs -guard -cast -doctor -flight -results -remains -cost -effect -winter -larger -islands -problems -chairman -grew -commander -isn -failed -selected -hurt -fort -regiment -majority -plans -shown -pretty -irish -characters -directly -scene -likely -operated -allow -matches -looks -houses -fellow -marriage -rules -florida -expected -nearby -congress -peace -recent -wait -subsequently -variety -serving -agreed -poor -attempt -wood -democratic -rural -mile -appears -township -soldiers -##ized -pennsylvania -closer -fighting -claimed -score -physical -filled -genus -specific -sitting -mom -therefore -supported -status -fear -cases -meaning -wales -minor -spain -vice -parish -separate -horse -fifth -remaining -branch -presented -stared -uses -forms -baseball -exactly -choice -discovered -composed -truth -russia -dad -ring -referred -numbers -greater -metres -slightly -direct -increase -responsible -crew -rule -trees -troops -broke -goes -individuals -hundred -weight -creek -sleep -defense -provides -ordered 
-jewish -safe -judge -whatever -corps -realized -growing -cities -gaze -lies -spread -letter -showed -situation -mayor -transport -watching -workers -extended -expression -normal -chart -multiple -border -mrs -walls -piano -heat -cannot -earned -products -drama -era -authority -seasons -join -grade -difficult -territory -mainly -stations -squadron -stepped -iron -19th -serve -appear -speak -broken -charge -knowledge -kilometres -removed -ships -campus -pushed -britain -leaves -recently -boston -latter -acquired -poland -quality -officers -presence -planned -nations -mass -broadcast -influence -wild -emperor -electric -headed -ability -promoted -yellow -ministry -throat -smaller -politician -latin -spoke -cars -males -lack -acting -seeing -consists -estate -pressure -newspaper -olympics -conditions -beat -elements -walking -vote -needs -carolina -featuring -levels -francisco -purpose -females -dutch -duke -ahead -gas -safety -serious -turning -highly -lieutenant -firm -amount -mixed -proposed -perfect -agreement -affairs -3rd -seconds -contemporary -paid -prison -label -administrative -intended -constructed -academic -teacher -races -formerly -nation -issued -shut -drums -housing -seems -graduated -mentioned -picked -recognized -shortly -protection -picture -notable -elections -1980s -loved -percent -racing -elizabeth -volume -hockey -beside -settled -competed -replied -drew -actress -marine -scotland -steel -glanced -farm -risk -tonight -positive -singles -effects -gray -screen -residents -sides -none -secondary -literature -polish -destroyed -flying -founder -households -lay -reserve -industrial -younger -approach -appearances -ones -finish -powerful -fully -growth -honor -jersey -projects -revealed -infantry -pair -equipment -visit -evening -grant -effort -treatment -buried -republican -primarily -bottom -owner -1970s -israel -gives -remain -spot -produce -champions -accepted -ways -##ally -losing -split -capacity -basis -trial -questions -20th -guess -officially 
-memorial -naval -initial -##ization -whispered -median -engineer -sydney -columbia -strength -tears -senate -asian -draw -warm -supposed -transferred -leaned -candidate -escape -mountains -potential -activity -seem -traffic -murder -slow -orchestra -haven -agency -taught -website -comedy -unable -storm -planning -albums -rugby -environment -scientific -grabbed -protect -boat -typically -damage -principal -divided -dedicated -ohio -pick -fought -driver -empty -shoulders -sort -thank -berlin -prominent -account -freedom -necessary -efforts -headquarters -follows -alongside -suggested -operating -steps -technical -begin -easily -teeth -speaking -settlement -scale -renamed -enemy -semi -joint -compared -scottish -leadership -analysis -offers -georgia -pieces -captured -animal -deputy -organized -combined -method -challenge -1960s -huge -wants -battalion -sons -rise -crime -types -facilities -telling -platform -sit -1990s -tells -assigned -pull -commonly -alive -letters -concept -conducted -wearing -happen -bought -becomes -holy -gets -defeat -languages -purchased -occurred -titled -declared -applied -sciences -concert -sounds -jazz -brain -painting -fleet -tax -michigan -animals -leaders -episodes -birth -clubs -palace -critical -refused -fair -leg -laughed -returning -surrounding -participated -formation -lifted -pointed -connected -rome -medicine -laid -powers -tall -shared -focused -knowing -yards -entrance -falls -calling -sources -chosen -beneath -resources -yard -nominated -silence -defined -gained -thirty -bodies -adopted -christmas -widely -register -apart -iran -premier -serves -unknown -parties -generation -continues -fields -brigade -quiet -teaching -clothes -impact -weapons -partner -flat -theater -relations -plants -suffered -begins -seats -armed -models -worth -laws -communities -classes -background -knows -thanks -quarter -reaching -humans -carry -killing -format -setting -architecture -disease -railroad -possibly -arthur -thoughts -doors -density 
-crowd -illinois -stomach -tone -unique -reports -anyway -liberal -vehicle -thick -dry -drug -faced -largely -facility -theme -holds -creation -strange -colonel -revolution -politics -turns -silent -rail -relief -independence -combat -shape -determined -sales -learned -4th -finger -providing -heritage -fiction -situated -designated -allowing -hosted -sight -interview -estimated -reduced -toronto -footballer -keeping -guys -damn -claim -motion -sixth -stayed -rear -receive -handed -twelve -dress -audience -granted -brazil -spirit -##ated -noticed -olympic -representative -tight -trouble -reviews -drink -vampire -missing -roles -ranked -newly -household -finals -critics -phase -massachusetts -pilot -unlike -philadelphia -bright -guns -crown -organizations -roof -respectively -clearly -tongue -marked -circle -bronze -expanded -sexual -supply -yourself -inspired -labour -reference -draft -connection -reasons -driving -jesus -cells -entry -neither -trail -claims -atlantic -orders -labor -nose -afraid -identified -intelligence -calls -cancer -attacked -passing -positions -imperial -grey -swedish -avoid -extra -uncle -covers -allows -surprise -materials -fame -hunter -citizens -figures -environmental -confirmed -shit -titles -performing -difference -acts -attacks -existing -votes -opportunity -nor -entirely -trains -opposite -pakistan -develop -resulted -representatives -actions -reality -pressed -barely -conversation -faculty -northwest -ends -documentary -nuclear -stock -sets -eat -alternative -resulting -creating -surprised -cemetery -drop -finding -cricket -streets -tradition -ride -ear -explained -composer -injury -apartment -municipal -educational -occupied -netherlands -clean -billion -constitution -learn -maximum -classical -lose -opposition -ontario -hills -rolled -ending -drawn -permanent -lewis -sites -chamber -scoring -height -lyrics -staring -officials -snow -oldest -qualified -interior -apparently -succeeded -thousand -dinner -lights -existence -heavily 
-greatest -conservative -send -bowl -catch -duty -speech -authorities -princess -performances -versions -shall -graduate -pictures -effective -remembered -poetry -desk -crossed -starring -starts -passenger -sharp -acres -ass -weather -falling -rank -fund -supporting -adult -heads -southeast -lane -condition -transfer -prevent -regions -earl -federation -relatively -answered -besides -obtained -portion -reaction -liked -peak -counter -religion -chain -rare -convention -aid -lie -vehicles -perform -squad -wonder -lying -crazy -sword -attempted -centuries -weren -philosophy -interested -sweden -wolf -frequently -abandoned -literary -alliance -task -entitled -threw -promotion -tiny -soccer -visited -achieved -defence -internal -persian -methods -arrested -otherwise -programming -villages -elementary -districts -rooms -criminal -conflict -worry -trained -attempts -waited -signal -truck -subsequent -programme -communist -faith -sector -carrying -laugh -controlled -korean -showing -origin -fuel -evil -brief -identity -darkness -pool -missed -publication -wings -invited -briefly -standards -kissed -ideas -climate -causing -walter -worse -albert -winners -desire -aged -northeast -dangerous -gate -doubt -wooden -poet -rising -funding -communications -communication -violence -copies -prepared -investigation -skills -pulling -containing -ultimately -offices -singing -understanding -tomorrow -christ -ward -pope -stands -5th -flow -studios -aired -commissioned -contained -exist -americans -wrestling -approved -kid -employed -respect -suit -asking -increasing -frame -angry -selling -1950s -thin -finds -temperature -statement -ali -explain -inhabitants -towns -extensive -narrow -flowers -promise -somewhere -closely -bureau -cape -weekly -presidential -legislative -launch -founding -artillery -strike -un -institutions -roll -writers -landing -chose -anymore -attorney -billboard -receiving -agricultural -breaking -sought -dave -admitted -lands -mexican -##bury -specifically -hole 
-moscow -roads -accident -proved -struck -guards -stuff -slid -expansion -melbourne -opposed -sub -southwest -architect -failure -plane -tank -listen -regarding -wet -introduction -metropolitan -fighter -inch -grown -gene -anger -fixed -khan -domestic -worldwide -chapel -mill -functions -examples -developing -turkey -hits -pocket -antonio -papers -grow -unless -circuit -18th -concerned -attached -journalist -selection -journey -converted -provincial -painted -hearing -aren -bands -negative -aside -wondered -knight -lap -noise -billy -shooting -bedroom -priest -resistance -motor -homes -sounded -giant -scenes -equal -comic -patients -hidden -solid -actual -bringing -afternoon -touched -funds -consisted -marie -canal -treaty -turkish -recognition -residence -cathedral -broad -knees -incident -shaped -fired -norwegian -handle -cheek -contest -represent -representing -birds -advantage -emergency -wrapped -drawing -notice -broadcasting -somehow -bachelor -seventh -collected -registered -establishment -assumed -chemical -personnel -retirement -portuguese -wore -tied -device -threat -progress -advance -##ised -banks -hired -manchester -nfl -teachers -structures -forever -tennis -helping -saturday -applications -junction -incorporated -neighborhood -dressed -ceremony -influenced -hers -stairs -decades -inner -kansas -hung -hoped -gain -scheduled -downtown -engaged -austria -clock -norway -certainly -pale -victor -employees -plate -putting -surrounded -##ists -finishing -blues -tropical -minnesota -consider -philippines -accept -retrieved -concern -anderson -properties -institution -gordon -successfully -vietnam -backing -outstanding -muslim -crossing -folk -producing -usual -demand -occurs -observed -lawyer -educated -pleasure -budget -items -quietly -colorado -philip -typical -##worth -derived -survived -asks -mental -jake -jews -distinguished -sri -extremely -athletic -loud -thousands -worried -transportation -horses -weapon -arena -importance -users -objects 
-contributed -douglas -aware -senator -johnny -sisters -engines -flag -investment -samuel -shock -capable -clark -row -wheel -refers -familiar -biggest -wins -hate -maintained -drove -hamilton -expressed -injured -underground -churches -wars -tunnel -passes -stupid -agriculture -softly -cabinet -regarded -joining -indiana -dates -spend -behavior -woods -protein -gently -chase -morgan -mention -burning -wake -combination -occur -mirror -leads -indeed -impossible -paintings -covering -soldier -locations -attendance -sell -historian -wisconsin -invasion -argued -painter -diego -changing -egypt -experienced -inches -missouri -grounds -spoken -switzerland -reform -rolling -forget -massive -resigned -burned -tennessee -locked -values -improved -wounded -universe -sick -dating -facing -purchase -##pur -moments -merged -anniversary -coal -brick -understood -causes -dynasty -queensland -establish -stores -crisis -promote -hoping -cards -referee -extension -raise -arizona -improve -colonial -formal -charged -palm -hide -rescue -faces -feelings -candidates -juan -6th -courses -weekend -luke -cash -fallen -delivered -affected -installed -carefully -tries -hollywood -costs -lincoln -responsibility -shore -proper -normally -maryland -assistance -constant -offering -friendly -waters -persons -realize -contain -trophy -partnership -factor -musicians -bound -oregon -indicated -houston -medium -consisting -somewhat -cycle -beer -moore -frederick -gotten -worst -weak -approached -arranged -chin -loan -bond -fifteen -pattern -disappeared -translated -##zed -lip -arab -capture -interests -insurance -shifted -cave -prix -warning -sections -courts -coat -plot -smell -golf -favorite -maintain -knife -voted -degrees -finance -quebec -opinion -translation -manner -ruled -operate -productions -choose -musician -confused -tired -separated -stream -techniques -committed -attend -ranking -kings -throw -passengers -measure -horror -mining -sand -danger -salt -calm -decade -dam -require -runner 
-rush -associate -greece -rivers -consecutive -matthew -##ski -sighed -sq -documents -closing -tie -accused -islamic -distributed -directors -organisation -7th -breathing -mad -lit -arrival -concrete -taste -composition -shaking -faster -amateur -adjacent -stating -twin -flew -publications -obviously -ridge -storage -carl -pages -concluded -desert -driven -universities -ages -terminal -sequence -borough -constituency -cousin -economics -dreams -margaret -notably -reduce -montreal -17th -ears -saved -vocal -riding -roughly -threatened -meters -meanwhile -landed -compete -repeated -grass -czech -regularly -charges -sudden -appeal -solution -describes -classification -glad -parking -belt -physics -rachel -hungarian -participate -expedition -damaged -gift -childhood -fifty -mathematics -jumped -letting -defensive -mph -testing -hundreds -shoot -owners -matters -smoke -israeli -kentucky -dancing -mounted -grandfather -designs -profit -argentina -truly -lawrence -cole -begun -detroit -willing -branches -smiling -decide -miami -enjoyed -recordings -##dale -poverty -ethnic -arabic -accompanied -fishing -determine -residential -acid -returns -starred -strategy -forty -businesses -equivalent -commonwealth -distinct -ill -seriously -##ped -harris -replace -rio -imagine -formula -ensure -additionally -scheme -conservation -occasionally -purposes -feels -favor -1930s -contrast -hanging -hunt -movies -instruments -victims -danish -christopher -busy -demon -sugar -earliest -colony -studying -duties -belgium -slipped -carter -visible -stages -iraq -commune -forming -continuing -talked -counties -legend -bathroom -option -tail -clay -daughters -afterwards -severe -jaw -visitors -devices -aviation -entering -subjects -temporary -swimming -forth -smooth -bush -operates -rocks -movements -signs -eddie -voices -honorary -memories -dallas -measures -racial -promised -harvard -16th -parliamentary -indicate -benefit -flesh -dublin -louisiana -patient -sleeping -membership -coastal 
-medieval -wanting -element -scholars -rice -limit -survive -makeup -rating -definitely -collaboration -obvious -baron -birthday -linked -soil -diocese -ncaa -offensive -shouldn -waist -plain -ross -organ -resolution -manufacturing -adding -relative -kennedy -whilst -moth -gardens -crash -heading -partners -credited -carlos -moves -cable -marshall -depending -bottle -represents -rejected -responded -existed -denmark -##ating -treated -graham -routes -talent -commissioner -drugs -secure -tests -reign -restored -photography -contributions -oklahoma -designer -disc -grin -seattle -robin -paused -atlanta -unusual -praised -las -laughing -satellite -hungary -visiting -interesting -factors -deck -poems -norman -##water -stuck -speaker -rifle -premiered -comics -actors -reputation -eliminated -8th -ceiling -prisoners -leather -austin -mississippi -rapidly -admiral -parallel -charlotte -guilty -tools -gender -divisions -fruit -laboratory -nelson -marry -rapid -aunt -tribe -requirements -aspects -suicide -amongst -adams -bone -ukraine -kick -sees -edinburgh -clothing -column -rough -gods -hunting -broadway -gathered -concerns -spending -ty -12th -snapped -requires -solar -bones -cavalry -iowa -drinking -waste -franklin -charity -thompson -stewart -tip -landscape -enjoy -singh -poem -listening -eighth -fred -differences -adapted -bomb -ukrainian -surgery -corporate -masters -anywhere -waves -odd -portugal -orleans -dick -debate -kent -eating -puerto -cleared -expect -cinema -guitarist -blocks -electrical -agree -involving -depth -dying -panel -struggle -peninsula -adults -novels -emerged -vienna -debuted -shoes -tamil -songwriter -meets -prove -beating -instance -heaven -scared -sending -marks -artistic -passage -superior -significantly -retained -##izing -technique -cheeks -warren -maintenance -destroy -extreme -allied -appearing -fill -advice -alabama -qualifying -policies -cleveland -hat -battery -authors -10th -soundtrack -acted -dated -lb -glance -equipped -coalition 
-funny -outer -ambassador -roy -possibility -couples -campbell -loose -ethan -supplies -gonna -monster -shake -agents -frequency -springs -dogs -practices -gang -plastic -easier -suggests -gulf -blade -exposed -colors -industries -markets -nervous -electoral -charts -legislation -ownership -##idae -appointment -shield -assault -socialist -abbey -monument -license -throne -employment -replacement -charter -suffering -accounts -oak -connecticut -strongly -wright -colour -13th -context -welsh -networks -voiced -gabriel -forehead -manage -schedule -totally -remix -forests -occupation -print -nicholas -brazilian -strategic -vampires -engineers -roots -seek -correct -instrumental -und -alfred -backed -stanley -robinson -traveled -wayne -austrian -achieve -exit -rates -strip -whereas -sing -deeply -adventure -bobby -jamie -careful -components -cap -useful -personality -knee -pushing -hosts -protest -ottoman -symphony -boundary -processes -considering -considerable -tons -cooper -trading -conduct -illegal -revolutionary -definition -harder -jacob -circumstances -destruction -popularity -grip -classified -liverpool -baltimore -flows -seeking -honour -approval -mechanical -till -happening -statue -critic -increasingly -immediate -describe -commerce -stare -indonesia -meat -rounds -boats -baker -orthodox -depression -formally -worn -naked -muttered -sentence -11th -document -criticism -wished -vessel -spiritual -bent -virgin -minimum -murray -lunch -danny -printed -compilation -keyboards -blow -belonged -raising -cutting -pittsburgh -9th -shadows -hated -indigenous -jon -15th -barry -scholar -oliver -stick -susan -meetings -attracted -spell -romantic -ye -demanded -customers -logan -revival -keys -modified -commanded -jeans -upset -phil -detective -hiding -resident -##bly -experiences -diamond -defeating -coverage -lucas -external -parks -franchise -helen -bible -successor -percussion -celebrated -lift -clan -romania -##ied -mills -nobody -achievement -shrugged -fault -rhythm 
-initiative -breakfast -carbon -lasted -violent -wound -killer -gradually -filmed -°c -processing -remove -criticized -guests -sang -chemistry -legislature -##bridge -uniform -escaped -integrated -proposal -purple -denied -liquid -influential -morris -nights -stones -intense -experimental -twisted -pace -nazi -mitchell -ny -blind -reporter -newspapers -14th -centers -burn -basin -forgotten -surviving -filed -collections -monastery -losses -manual -couch -description -appropriate -merely -missions -sebastian -restoration -replacing -triple -elder -julia -warriors -benjamin -julian -convinced -stronger -amazing -declined -versus -merchant -happens -output -finland -bare -barbara -absence -ignored -dawn -injuries -producers -luis -##ities -kw -admit -expensive -electricity -exception -symbol -ladies -shower -sheriff -characteristics -##je -aimed -button -ratio -effectively -summit -angle -jury -bears -foster -vessels -pants -executed -evans -dozen -advertising -kicked -patrol -competitions -lifetime -principles -athletics -birmingham -sponsored -rob -nomination -acoustic -creature -longest -credits -harbor -dust -josh -territories -milk -infrastructure -completion -thailand -indians -leon -archbishop -assist -pitch -blake -arrangement -girlfriend -serbian -operational -hence -sad -scent -fur -sessions -refer -rarely -exists -1892 -scientists -dirty -penalty -burst -portrait -seed -pole -limits -rival -stable -grave -constitutional -alcohol -arrest -flower -mystery -devil -architectural -relationships -greatly -habitat -##istic -larry -progressive -remote -cotton -preserved -reaches -cited -vast -scholarship -decisions -teach -editions -knocked -eve -searching -partly -participation -animated -fate -excellent -alternate -saints -youngest -climbed -suggest -discussion -staying -choir -lakes -jacket -revenue -nevertheless -peaked -instrument -wondering -annually -managing -neil -1891 -signing -terry -apply -clinical -brooklyn -aim -catherine -fuck -farmers -figured 
-ninth -pride -hugh -ordinary -involvement -comfortable -shouted -encouraged -representation -sharing -panic -exact -cargo -competing -fat -cried -1920s -occasions -cabin -borders -utah -marcus -##isation -badly -muscles -victorian -transition -warner -bet -permission -slave -terrible -similarly -shares -seth -uefa -possession -medals -benefits -colleges -lowered -perfectly -transit -##kar -publisher -##ened -harrison -deaths -elevation -asleep -machines -sigh -ash -hardly -argument -occasion -parent -decline -contribution -concentration -opportunities -hispanic -guardian -extent -emotions -hips -mason -volumes -bloody -controversy -diameter -steady -mistake -phoenix -identify -violin -departure -richmond -spin -funeral -enemies -1864 -literally -connor -random -sergeant -grab -confusion -1865 -transmission -informed -leaning -sacred -suspended -thinks -gates -portland -luck -agencies -yours -hull -expert -muscle -layer -practical -sculpture -jerusalem -latest -lloyd -statistics -deeper -recommended -warrior -arkansas -mess -supports -greg -eagle -recovered -rated -concerts -rushed -stops -eggs -premiere -keith -delhi -turner -pit -affair -belief -paint -##zing -victim -withdrew -bonus -styles -fled -glasgow -technologies -funded -adaptation -portrayed -cooperation -supporters -judges -bernard -hallway -ralph -graduating -controversial -distant -continental -spider -bite -recognize -intention -mixing -egyptian -bow -tourism -suppose -claiming -dominated -participants -nurse -partially -tape -psychology -essential -touring -duo -voting -civilian -emotional -channels -apparent -hebrew -1887 -tommy -carrier -intersection -beast -hudson -bench -discuss -costa -##ered -detailed -behalf -drivers -unfortunately -obtain -rocky -##dae -siege -friendship -1861 -hang -governments -collins -respond -wildlife -preferred -operator -laura -pregnant -videos -dennis -suspected -boots -instantly -weird -automatic -businessman -alleged -placing -throwing -mood -1862 -perry -venue 
-jet -remainder -passion -biological -boyfriend -1863 -dirt -buffalo -ron -segment -abuse -genre -thrown -stroke -colored -stress -exercise -displayed -struggled -abroad -dramatic -wonderful -thereafter -madrid -component -widespread -##sed -tale -citizen -todd -vancouver -overseas -forcing -crying -descent -discussed -substantial -ranks -regime -provinces -drum -zane -tribes -proof -researchers -volunteer -manor -silk -milan -donated -allies -venture -principle -delivery -enterprise -bars -traditionally -witch -reminded -copper -pete -inter -colin -grinned -elsewhere -competitive -frequent -scream -tension -texts -submarine -finnish -defending -defend -pat -detail -affiliated -stuart -themes -periods -tool -belgian -ruling -crimes -answers -folded -licensed -demolished -hans -lucy -1881 -lion -traded -photographs -writes -craig -trials -generated -beth -noble -debt -percentage -yorkshire -erected -viewed -grades -confidence -ceased -islam -telephone -retail -chile -m² -roberts -sixteen -commented -hampshire -innocent -dual -pounds -checked -regulations -afghanistan -sung -rico -liberty -assets -bigger -options -angels -relegated -tribute -wells -attending -leaf -romanian -monthly -patterns -gmina -madison -hurricane -rev -##ians -bristol -elite -valuable -disaster -democracy -awareness -germans -freyja -loop -absolutely -paying -populations -maine -sole -prayer -spencer -releases -doorway -bull -lover -midnight -conclusion -thirteen -mediterranean -nhl -proud -sample -##hill -drummer -guinea -murphy -climb -instant -attributed -horn -ain -railways -autumn -ferry -opponent -traveling -secured -corridor -stretched -tales -sheet -trinity -cattle -helps -indicates -manhattan -murdered -fitted -gentle -grandmother -mines -shocked -vegas -produces -caribbean -belong -continuous -desperate -drunk -historically -trio -waved -raf -dealing -nathan -murmured -interrupted -residing -scientist -pioneer -harold -aaron -delta -attempting -minority -believes -chorus -tend -lots 
-eyed -indoor -load -shots -updated -jail -concerning -connecting -wealth -slaves -arrive -rangers -sufficient -rebuilt -##wick -cardinal -flood -muhammad -whenever -relation -runners -moral -repair -viewers -arriving -revenge -punk -assisted -bath -fairly -breathe -lists -innings -illustrated -whisper -nearest -voters -clinton -ties -ultimate -screamed -beijing -lions -andre -fictional -gathering -comfort -radar -suitable -dismissed -hms -ban -pine -wrist -atmosphere -voivodeship -bid -timber -##ned -giants -cameron -recovery -uss -identical -categories -switched -serbia -laughter -noah -ensemble -therapy -peoples -touching -##off -locally -pearl -platforms -everywhere -ballet -tables -lanka -herbert -outdoor -toured -derek -1883 -spaces -contested -swept -1878 -exclusive -slight -connections -winds -prisoner -collective -bangladesh -tube -publicly -wealthy -isolated -insisted -fortune -ticket -spotted -reportedly -animation -enforcement -tanks -decides -wider -lowest -owen -nod -hitting -gregory -furthermore -magazines -fighters -solutions -pointing -requested -peru -reed -chancellor -knights -mask -worker -eldest -flames -reduction -volunteers -reporting -wire -advisory -endemic -origins -settlers -pursue -knock -consumer -1876 -eu -compound -creatures -mansion -sentenced -ivan -deployed -guitars -frowned -involves -mechanism -kilometers -perspective -shops -terminus -duncan -alien -fist -bridges -##pers -heroes -derby -swallowed -patent -sara -illness -characterized -adventures -slide -hawaii -jurisdiction -organised -adelaide -walks -biology -rogers -swing -tightly -boundaries -prepare -implementation -stolen -certified -colombia -edwards -garage -recalled -rage -harm -nigeria -breast -furniture -pupils -settle -cuba -balls -alaska -21st -linear -thrust -celebration -latino -genetic -terror -##ening -lightning -fee -witness -lodge -establishing -skull -earning -hood -rebellion -sporting -warned -missile -devoted -activist -porch -worship -fourteen -package 
-decorated -##shire -housed -chess -sailed -doctors -oscar -joan -treat -garcia -harbour -jeremy -traditions -dominant -jacques -##gon -relocated -1879 -amendment -sized -companion -simultaneously -volleyball -spun -acre -increases -stopping -loves -belongs -affect -drafted -tossed -scout -battles -1875 -filming -shoved -munich -tenure -vertical -romance -argue -craft -ranging -opens -honest -tyler -yesterday -muslims -reveal -snake -immigrants -radical -screaming -speakers -firing -saving -belonging -ease -lighting -prefecture -blame -farmer -hungry -grows -rubbed -beam -sur -subsidiary -armenian -dropping -conventional -qualify -spots -sweat -festivals -immigration -physician -discover -exposure -sandy -explanation -isaac -implemented -##fish -hart -initiated -stakes -presents -heights -householder -pleased -tourist -regardless -slip -closest -surely -sultan -brings -riley -preparation -aboard -slammed -baptist -experiment -ongoing -interstate -organic -playoffs -1877 -hindu -tours -tier -plenty -arrangements -talks -trapped -excited -sank -athens -1872 -denver -welfare -suburb -athletes -trick -diverse -belly -exclusively -yelled -conversion -1874 -internationally -computers -conductor -abilities -sensitive -dispute -measured -globe -rocket -prices -amsterdam -flights -tigers -municipalities -emotion -references -explains -airlines -manufactured -archaeological -1873 -interpretation -devon -##ites -settlements -kissing -absolute -improvement -impressed -barcelona -sullivan -jefferson -towers -jesse -julie -grandson -gauge -regard -rings -interviews -trace -raymond -thumb -departments -burns -serial -bulgarian -scores -demonstrated -1866 -kyle -alberta -underneath -romanized -relieved -acquisition -phrase -cliff -reveals -cuts -merger -custom -nee -gilbert -graduation -assessment -difficulty -demands -swung -democrat -commons -1940s -grove -completing -focuses -sum -substitute -bearing -stretch -reception -reflected -essentially -destination -pairs -##ched 
-survival -resource -##bach -promoting -doubles -messages -tear -##fully -parade -florence -harvey -incumbent -partial -pedro -frozen -procedure -olivia -controls -shelter -personally -temperatures -brisbane -tested -sits -marble -comprehensive -oxygen -leonard -##kov -inaugural -iranian -referring -quarters -attitude -mainstream -lined -mars -dakota -norfolk -unsuccessful -explosion -helicopter -congressional -##sing -inspector -bitch -seal -departed -divine -coaching -examination -punishment -manufacturer -sink -columns -unincorporated -signals -nevada -squeezed -dylan -dining -martial -manuel -eighteen -elevator -brushed -plates -ministers -congregation -slept -specialized -taxes -restricted -negotiations -likes -statistical -arnold -inspiration -execution -bold -intermediate -significance -margin -ruler -wheels -gothic -intellectual -dependent -listened -eligible -buses -widow -syria -earn -cincinnati -collapsed -recipient -secrets -accessible -philippine -maritime -goddess -clerk -surrender -breaks -playoff -ideal -beetle -aspect -soap -regulation -strings -expand -anglo -shorter -crosses -retreat -tough -coins -wallace -directions -pressing -shipping -locomotives -comparison -topics -nephew -distinction -honors -travelled -sierra -ibn -fortress -recognised -carved -1869 -clients -intent -coaches -describing -bread -##ington -beaten -northwestern -merit -collapse -challenges -historians -objective -submitted -virus -attacking -drake -assume -diseases -stem -leeds -farming -glasses -visits -nowhere -fellowship -relevant -carries -restaurants -experiments -constantly -bases -targets -shah -tenth -opponents -verse -territorial -writings -corruption -instruction -inherited -reverse -emphasis -employee -arch -keeps -rabbi -watson -payment -uh -nancy -##tre -venice -fastest -sexy -banned -adrian -properly -ruth -touchdown -dollar -boards -metre -circles -edges -favour -travels -liberation -scattered -firmly -holland -permitted -diesel -kenya -den -originated -demons 
-resumed -dragged -rider -servant -blinked -extend -torn -##sey -input -meal -everybody -cylinder -kinds -camps -bullet -logic -croatian -evolved -healthy -fool -wise -preserve -pradesh -respective -artificial -gross -corresponding -convicted -cage -caroline -dialogue -##dor -narrative -stranger -mario -christianity -failing -trent -commanding -buddhist -1848 -maurice -focusing -yale -bike -altitude -mouse -revised -##sley -veteran -pulls -theology -crashed -campaigns -legion -##ability -drag -excellence -customer -cancelled -intensity -excuse -liga -participating -contributing -printing -##burn -variable -curious -legacy -renaissance -symptoms -binding -vocalist -dancer -grammar -gospel -democrats -enters -diplomatic -hitler -clouds -mathematical -quit -defended -oriented -##heim -fundamental -hardware -impressive -equally -convince -confederate -guilt -chuck -sliding -magnetic -narrowed -petersburg -bulgaria -otto -phd -skill -hopes -pitcher -reservoir -hearts -automatically -expecting -mysterious -bennett -extensively -imagined -seeds -monitor -fix -##ative -journalism -struggling -signature -ranch -encounter -photographer -observation -protests -influences -calendar -cruz -croatia -locomotive -hughes -naturally -shakespeare -basement -hook -uncredited -faded -theories -approaches -dare -phillips -filling -fury -obama -efficient -arc -deliver -breeding -inducted -leagues -efficiency -axis -montana -eagles -##ked -supplied -instructions -karen -picking -indicating -trap -anchor -practically -christians -tomb -vary -occasional -electronics -lords -readers -newcastle -faint -innovation -collect -situations -engagement -claude -mixture -##feld -peer -tissue -lean -°f -floors -architects -reducing -rope -1859 -ottawa -##har -samples -banking -declaration -proteins -resignation -francois -saudi -advocate -exhibited -armor -twins -divorce -##ras -abraham -reviewed -temporarily -matrix -physically -pulse -curled -difficulties -bengal -usage -##ban -riders -certificate 
-holes -warsaw -distinctive -mutual -1857 -customs -circular -eugene -removal -loaded -mere -vulnerable -depicted -generations -dame -heir -enormous -lightly -climbing -pitched -lessons -pilots -nepal -preparing -brad -louise -renowned -liam -##ably -shaw -brilliant -bills -##nik -fucking -mainland -pleasant -seized -veterans -jerked -fail -brush -radiation -stored -warmth -southeastern -nate -sin -raced -berkeley -joke -athlete -designation -trunk -roland -qualification -heels -artwork -receives -judicial -reserves -##bed -woke -installation -abu -floating -fake -lesser -excitement -interface -concentrated -addressed -characteristic -amanda -saxophone -monk -releasing -egg -dies -interaction -defender -outbreak -glory -loving -sequel -consciousness -awake -ski -enrolled -handling -rookie -brow -somebody -biography -warfare -amounts -contracts -presentation -fabric -dissolved -challenged -meter -psychological -elevated -rally -accurate -##tha -hospitals -undergraduate -specialist -venezuela -exhibit -shed -nursing -protestant -fluid -structural -footage -jared -consistent -prey -##ska -succession -reflect -exile -lebanon -wiped -suspect -shanghai -resting -integration -preservation -marvel -variant -pirates -sheep -rounded -capita -sailing -colonies -manuscript -deemed -variations -clarke -functional -emerging -boxing -relaxed -curse -azerbaijan -heavyweight -nickname -editorial -rang -grid -tightened -earthquake -flashed -miguel -rushing -##ches -improvements -boxes -brooks -consumption -molecular -felix -societies -repeatedly -variation -aids -civic -graphics -professionals -realm -autonomous -receiver -delayed -workshop -militia -chairs -canyon -harsh -extending -lovely -happiness -##jan -stake -eyebrows -embassy -wellington -hannah -corners -bishops -swear -cloth -contents -namely -commenced -1854 -stanford -nashville -courage -graphic -commitment -garrison -hamlet -clearing -rebels -attraction -literacy -cooking -ruins -temples -jenny -humanity -celebrate 
-hasn -freight -sixty -rebel -bastard -newton -deer -##ges -##ching -smiles -delaware -singers -approaching -assists -flame -boulevard -barrel -planted -pursuit -consequences -shallow -invitation -rode -depot -ernest -kane -rod -concepts -preston -topic -chambers -striking -blast -arrives -descendants -montgomery -ranges -worlds -chaos -praise -fewer -1855 -sanctuary -mud -programmes -maintaining -harper -bore -handsome -closure -tournaments -nebraska -linda -facade -puts -satisfied -argentine -dale -cork -dome -panama -##yl -1858 -tasks -experts -##ates -feeding -equation -engage -bryan -um -quartet -disbanded -sheffield -blocked -gasped -delay -kisses -connects -##non -sts -poured -creator -publishers -guided -ellis -extinct -hug -gaining -##ord -complicated -poll -clenched -investigate -thereby -quantum -spine -cdp -humor -kills -administered -semifinals -encountered -ignore -commentary -##maker -bother -roosevelt -plains -halfway -flowing -cultures -crack -imprisoned -neighboring -airline -gather -wolves -marathon -transformed -cruise -organisations -punch -exhibitions -numbered -alarm -ratings -daddy -silently -##stein -queens -colours -impression -guidance -tactical -##rat -marshal -della -arrow -rested -feared -tender -owns -bitter -advisor -escort -##ides -spare -farms -grants -dragons -encourage -colleagues -cameras -sucked -pile -spirits -prague -statements -suspension -landmark -fence -torture -recreation -bags -permanently -survivors -pond -spy -predecessor -bombing -coup -protecting -transformation -glow -##lands -dug -priests -andrea -feat -barn -jumping -##ologist -casualties -stern -auckland -pipe -serie -revealing -trevor -mercy -spectrum -consist -governing -collaborated -possessed -epic -comprises -blew -shane -lopez -honored -magical -sacrifice -judgment -perceived -hammer -baronet -tune -das -missionary -sheets -neutral -oral -threatening -attractive -shade -aims -seminary -estates -1856 -michel -wounds -refugees -manufacturers -mercury 
-syndrome -porter -##iya -##din -hamburg -identification -upstairs -purse -widened -pause -cared -breathed -affiliate -santiago -prevented -celtic -fisher -recruited -byzantine -reconstruction -farther -diet -sake -spite -sensation -blank -separation -##hon -vladimir -armies -anime -accommodate -orbit -cult -sofia -##ify -founders -sustained -disorder -honours -northeastern -mia -crops -violet -threats -blanket -fires -canton -followers -southwestern -prototype -voyage -assignment -altered -moderate -protocol -pistol -questioned -brass -lifting -1852 -math -authored -doug -dimensional -dynamic -1851 -pronounced -grateful -quest -uncomfortable -boom -presidency -stevens -relating -politicians -barrier -quinn -diana -mosque -tribal -palmer -portions -sometime -chester -treasure -bend -millions -reforms -registration -consequently -monitoring -ate -preliminary -brandon -invented -eaten -exterior -intervention -ports -documented -displays -lecture -sally -favourite -vermont -invisible -isle -breed -journalists -relay -speaks -backward -explore -midfielder -actively -stefan -procedures -cannon -blond -kenneth -centered -servants -chains -libraries -malcolm -essex -henri -slavery -##hal -facts -fairy -coached -cassie -cats -washed -cop -announcement -2000s -vinyl -activated -marco -frontier -growled -curriculum -##das -loyal -accomplished -leslie -ritual -kenny -vii -napoleon -hollow -hybrid -jungle -stationed -friedrich -counted -##ulated -platinum -theatrical -seated -col -rubber -glen -diversity -healing -extends -provisions -administrator -columbus -tributary -assured -##uous -prestigious -examined -lectures -grammy -ronald -associations -bailey -allan -essays -flute -believing -consultant -proceedings -travelling -1853 -kerala -yugoslavia -buddy -methodist -burial -centres -batman -discontinued -dock -stockholm -lungs -severely -citing -manga -steal -mumbai -iraqi -robot -celebrity -bride -broadcasts -abolished -pot -joel -overhead -franz -packed -reconnaissance 
-johann -acknowledged -introduce -handled -doctorate -developments -drinks -alley -palestine -##aki -proceeded -recover -bradley -grain -patch -afford -infection -nationalist -legendary -interchange -virtually -gen -gravity -exploration -amber -vital -wishes -powell -doctrine -elbow -screenplay -##bird -contribute -indonesian -creates -enzyme -kylie -discipline -drops -manila -hunger -layers -suffer -fever -bits -monica -keyboard -manages -##hood -searched -appeals -##bad -testament -grande -reid -##war -beliefs -congo -requiring -casey -1849 -regret -streak -rape -depends -syrian -sprint -pound -tourists -upcoming -pub -tense -##els -practiced -nationwide -guild -motorcycle -liz -##zar -chiefs -desired -elena -precious -absorbed -relatives -booth -pianist -##mal -citizenship -exhausted -wilhelm -##ceae -##hed -noting -quarterback -urge -hectares -##gue -holly -blonde -davies -parked -sustainable -stepping -twentieth -airfield -nest -chip -##nell -shaft -paulo -requirement -paradise -tobacco -trans -renewed -vietnamese -suggesting -catching -holmes -enjoying -trips -colt -holder -butterfly -nerve -reformed -cherry -bowling -trailer -carriage -goodbye -appreciate -toy -joshua -interactive -enabled -involve -##kan -collar -determination -bunch -recall -shorts -superintendent -episcopal -frustration -giovanni -nineteenth -laser -privately -array -circulation -##ovic -armstrong -deals -painful -permit -discrimination -aires -retiring -cottage -horizon -ellen -jamaica -ripped -fernando -chapters -patron -lecturer -behaviour -genes -georgian -export -solomon -rivals -seventeen -rodriguez -princeton -independently -sox -1847 -arguing -entity -casting -hank -criteria -oakland -geographic -milwaukee -reflection -expanding -conquest -dubbed -halt -brave -brunswick -arched -curtis -divorced -predominantly -somerset -streams -ugly -zoo -horrible -curved -buenos -fierce -dictionary -vector -theological -unions -handful -stability -punjab -segments -altar -ignoring -gesture 
-monsters -pastor -thighs -unexpected -operators -abruptly -coin -compiled -associates -improving -migration -compact -collegiate -quarterfinals -roster -restore -assembled -hurry -oval -##cies -1846 -flags -martha -victories -sharply -##rated -argues -deadly -drawings -symbols -performer -griffin -restrictions -editing -andrews -journals -arabia -compositions -dee -pierce -removing -hindi -casino -runway -civilians -minds -##zation -refuge -rent -retain -potentially -conferences -suburban -conducting -descended -massacre -ammunition -terrain -fork -souls -counts -chelsea -durham -drives -cab -perth -realizing -palestinian -finn -simpson -##dal -betty -moreover -particles -cardinals -tent -evaluation -extraordinary -inscription -wednesday -chloe -maintains -panels -ashley -trucks -##nation -cluster -sunlight -strikes -zhang -dialect -tucked -collecting -##mas -##sville -quoted -evan -franco -aria -buying -cleaning -closet -provision -apollo -clinic -rat -necessarily -##ising -venues -flipped -cent -spreading -trustees -checking -authorized -disappointed -##ado -notion -duration -trumpet -hesitated -topped -brussels -rolls -theoretical -hint -define -aggressive -repeat -wash -peaceful -optical -width -allegedly -mcdonald -strict -##illa -investors -jam -witnesses -sounding -miranda -michelle -hugo -harmony -valid -lynn -glared -nina -headquartered -diving -boarding -gibson -albanian -marsh -routine -dealt -enhanced -intelligent -substance -targeted -enlisted -discovers -spinning -observations -pissed -smoking -capitol -varied -costume -seemingly -indies -compensation -surgeon -thursday -arsenal -westminster -suburbs -rid -anglican -##ridge -knots -foods -alumni -lighter -fraser -whoever -portal -scandal -gavin -advised -instructor -flooding -terrorist -teenage -interim -senses -duck -teen -thesis -abby -eager -overcome -newport -glenn -rises -shame -prompted -priority -forgot -bomber -nicolas -protective -cartoon -katherine -breeze -lonely -trusted -henderson 
-richardson -relax -palms -remarkable -legends -cricketer -essay -ordained -edmund -rifles -trigger -##uri -##away -sail -alert -1830 -audiences -penn -sussex -siblings -pursued -indianapolis -resist -rosa -consequence -succeed -avoided -1845 -##ulation -inland -##tie -##nna -counsel -profession -chronicle -hurried -##una -eyebrow -eventual -bleeding -innovative -cure -committees -accounting -scope -hardy -heather -tenor -gut -herald -codes -tore -scales -wagon -luxury -tin -prefer -fountain -triangle -bonds -darling -convoy -dried -traced -beings -troy -accidentally -slam -findings -smelled -joey -lawyers -outcome -steep -bosnia -configuration -shifting -toll -brook -performers -lobby -philosophical -construct -shrine -aggregate -cox -phenomenon -savage -insane -solely -reynolds -nationally -holdings -consideration -enable -edgar -fights -relegation -chances -atomic -hub -conjunction -awkward -reactions -currency -finale -kumar -underwent -steering -elaborate -gifts -comprising -melissa -veins -reasonable -sunshine -solve -trails -inhabited -elimination -ethics -huh -ana -molly -consent -apartments -layout -marines -hunters -bulk -##oma -hometown -##wall -##mont -cracked -reads -neighbouring -withdrawn -admission -wingspan -damned -anthology -lancashire -brands -batting -forgive -cuban -awful -##lyn -dimensions -imagination -dante -tracking -desperately -goalkeeper -##yne -groaned -workshops -confident -burton -gerald -milton -circus -uncertain -slope -copenhagen -sophia -fog -philosopher -portraits -accent -cycling -varying -gripped -larvae -garrett -specified -scotia -mature -luther -kurt -rap -##kes -aerial -ferdinand -heated -transported -##shan -safely -nonetheless -##orn -##gal -motors -demanding -##sburg -startled -##brook -ally -generate -caps -ghana -stained -mentions -beds -afterward -##bling -utility -##iro -richards -1837 -conspiracy -conscious -shining -footsteps -observer -cyprus -urged -loyalty -developer -probability -olive -upgraded -gym -miracle 
-insects -graves -1844 -ourselves -hydrogen -katie -tickets -poets -planes -prevention -witnessed -dense -jin -randy -tang -warehouse -monroe -archived -elderly -investigations -alec -granite -mineral -conflicts -controlling -aboriginal -mechanics -stan -stark -rhode -skirt -est -bombs -respected -##horn -imposed -limestone -deny -nominee -memphis -grabbing -disabled -amusement -frankfurt -corn -referendum -varies -slowed -disk -firms -unconscious -incredible -clue -sue -##zhou -twist -##cio -joins -idaho -chad -developers -computing -destroyer -mortal -tucker -kingston -choices -carson -whitney -geneva -pretend -dimension -staged -plateau -maya -##une -freestyle -rovers -##ids -tristan -classroom -prospect -##hus -honestly -diploma -lied -thermal -auxiliary -feast -unlikely -iata -morocco -pounding -treasury -lithuania -considerably -1841 -dish -1812 -geological -matching -stumbled -destroying -marched -brien -advances -nicole -settling -measuring -directing -##mie -tuesday -bassist -capabilities -stunned -fraud -torpedo -##phone -anton -wisdom -surveillance -ruined -##ulate -lawsuit -healthcare -theorem -halls -trend -aka -horizontal -dozens -acquire -lasting -swim -hawk -gorgeous -fees -vicinity -decrease -adoption -tactics -##ography -pakistani -##ole -draws -##hall -willie -burke -heath -algorithm -integral -powder -elliott -brigadier -jackie -tate -varieties -darker -##cho -lately -cigarette -specimens -adds -##ensis -##inger -exploded -finalist -murders -wilderness -arguments -nicknamed -acceptance -onwards -manufacture -robertson -jets -tampa -enterprises -loudly -composers -nominations -1838 -malta -inquiry -automobile -hosting -viii -rays -tilted -grief -museums -strategies -furious -euro -equality -cohen -poison -surrey -wireless -governed -ridiculous -moses -##esh -vanished -barnes -attract -morrison -istanbul -##iness -absent -rotation -petition -janet -##logical -satisfaction -custody -deliberately -observatory -comedian -surfaces -pinyin -novelist 
-strictly -canterbury -oslo -monks -embrace -jealous -photograph -continent -dorothy -marina -excess -holden -allegations -explaining -stack -avoiding -lance -storyline -majesty -poorly -spike -bradford -raven -travis -classics -proven -voltage -pillow -fists -butt -1842 -interpreted -1839 -gage -telegraph -lens -promising -expelled -casual -collector -zones -silly -nintendo -##kh -downstairs -chef -suspicious -afl -flies -vacant -uganda -pregnancy -condemned -lutheran -estimates -cheap -decree -saxon -proximity -stripped -idiot -deposits -contrary -presenter -magnus -glacier -offense -edwin -##ori -upright -##long -bolt -##ois -toss -geographical -##izes -environments -delicate -marking -abstract -xavier -nails -windsor -plantation -occurring -equity -saskatchewan -fears -drifted -sequences -vegetation -revolt -##stic -1843 -sooner -fusion -opposing -nato -skating -1836 -secretly -ruin -lease -flora -anxiety -##ological -##mia -bout -taxi -emmy -frost -rainbow -compounds -foundations -rainfall -assassination -nightmare -dominican -achievements -deserve -orlando -intact -armenia -##nte -calgary -valentine -marion -proclaimed -theodore -bells -courtyard -thigh -gonzalez -console -troop -minimal -everyday -supporter -terrorism -buck -openly -presbyterian -activists -carpet -##iers -rubbing -uprising -cute -conceived -legally -##cht -millennium -cello -velocity -rescued -cardiff -1835 -rex -concentrate -senators -beard -rendered -glowing -battalions -scouts -competitors -sculptor -catalogue -arctic -ion -raja -bicycle -glancing -lawn -##woman -gentleman -lighthouse -publish -predicted -calculated -variants -##gne -strain -winston -deceased -touchdowns -brady -caleb -sinking -echoed -crush -hon -blessed -protagonist -hayes -endangered -magnitude -editors -##tine -estimate -responsibilities -##mel -backup -laying -consumed -sealed -zurich -lovers -frustrated -##eau -ahmed -kicking -treasurer -1832 -biblical -refuse -terrified -pump -agrees -genuine -imprisonment 
-refuses -plymouth -lou -##nen -tara -trembling -antarctic -ton -learns -##tas -crap -crucial -faction -atop -##borough -wrap -lancaster -odds -hopkins -erik -lyon -##eon -bros -snap -locality -empress -crowned -cal -acclaimed -chuckled -clara -sends -mild -towel -wishing -assuming -interviewed -##bal -interactions -eden -cups -helena -indie -beck -##fire -batteries -filipino -wizard -parted -traces -##born -rows -idol -albany -delegates -##ees -##sar -discussions -notre -instructed -belgrade -highways -suggestion -lauren -possess -orientation -alexandria -abdul -beats -salary -reunion -ludwig -alright -wagner -intimate -pockets -slovenia -hugged -brighton -merchants -cruel -stole -trek -slopes -repairs -enrollment -politically -underlying -promotional -counting -boeing -isabella -naming -keen -bacteria -listing -separately -belfast -ussr -lithuanian -anybody -ribs -sphere -martinez -cock -embarrassed -proposals -fragments -nationals -##wski -premises -fin -alpine -matched -freely -bounded -jace -sleeve -pier -populated -evident -##like -frances -flooded -##dle -frightened -pour -trainer -framed -visitor -challenging -pig -wickets -##fold -infected -##pes -arose -reward -ecuador -oblast -vale -shuttle -##usa -bach -rankings -forbidden -cornwall -accordance -salem -consumers -bruno -fantastic -toes -machinery -resolved -julius -remembering -propaganda -iceland -bombardment -tide -contacts -wives -##rah -concerto -macdonald -albania -implement -daisy -tapped -sudan -helmet -mistress -crop -sunk -finest -##craft -hostile -boxer -fr -paths -adjusted -habit -ballot -supervision -soprano -bullets -wicked -sunset -regiments -disappear -lamp -performs -##gia -rabbit -digging -incidents -entries -##cion -dishes -introducing -##ati -##fied -freshman -slot -jill -tackles -baroque -backs -##iest -lone -sponsor -destiny -altogether -convert -##aro -consensus -shapes -demonstration -basically -feminist -auction -artifacts -##bing -strongest -halifax -allmusic -mighty -smallest 
-precise -alexandra -viola -##los -##ille -manuscripts -##illo -dancers -ari -managers -monuments -blades -barracks -springfield -maiden -consolidated -electron -berry -airing -wheat -nobel -inclusion -blair -payments -geography -bee -eleanor -react -##hurst -afc -manitoba -lineup -fitness -recreational -investments -airborne -disappointment -##dis -edmonton -viewing -renovation -infant -bankruptcy -roses -aftermath -pavilion -carpenter -withdrawal -ladder -discussing -popped -reliable -agreements -rochester -##abad -curves -bombers -rao -reverend -decreased -choosing -stiff -consulting -naples -crawford -tracy -ribbon -cops -crushed -deciding -unified -teenager -accepting -flagship -poles -sanchez -inspection -revived -skilled -induced -exchanged -flee -locals -tragedy -swallow -hanna -demonstrate -##ela -salvador -flown -contestants -civilization -##ines -wanna -rhodes -fletcher -hector -knocking -considers -nash -mechanisms -sensed -mentally -walt -unclear -##eus -renovated -madame -crews -governmental -undertaken -monkey -##ben -##ato -fatal -armored -copa -caves -governance -grasp -perception -certification -froze -damp -tugged -wyoming -##rg -##ero -newman -nerves -curiosity -graph -##ami -withdraw -tunnels -dull -meredith -moss -exhibits -neighbors -communicate -accuracy -explored -raiders -republicans -secular -kat -superman -penny -criticised -freed -conviction -ham -likewise -delegation -gotta -doll -promises -technological -myth -nationality -resolve -convent -sharon -dig -sip -coordinator -entrepreneur -fold -##dine -capability -councillor -synonym -blown -swan -cursed -1815 -jonas -haired -sofa -canvas -keeper -rivalry -##hart -rapper -speedway -swords -postal -maxwell -estonia -potter -recurring -errors -##oni -cognitive -1834 -claws -nadu -roberto -bce -wrestler -ellie -infinite -ink -##tia -presumably -finite -staircase -noel -patricia -nacional -chill -eternal -tu -preventing -prussia -fossil -limbs -##logist -ernst -frog -perez -rene -prussian 
-##ios -molecules -regulatory -answering -opinions -sworn -lengths -supposedly -hypothesis -upward -habitats -seating -ancestors -drank -yield -synthesis -researcher -modest -##var -mothers -peered -voluntary -homeland -acclaim -##igan -static -valve -luxembourg -alto -carroll -receptor -norton -ambulance -##tian -johnston -catholics -depicting -jointly -elephant -gloria -mentor -badge -ahmad -distinguish -remarked -councils -precisely -allison -advancing -detection -crowded -cooperative -ankle -mercedes -dagger -surrendered -pollution -commit -subway -jeffrey -lesson -sculptures -provider -##fication -membrane -timothy -rectangular -fiscal -heating -teammate -basket -particle -anonymous -deployment -missiles -courthouse -proportion -shoe -sec -complaints -forbes -blacks -abandon -remind -sizes -overwhelming -autobiography -natalie -##awa -risks -contestant -countryside -babies -scorer -invaded -enclosed -proceed -hurling -disorders -##cu -reflecting -continuously -cruiser -graduates -freeway -investigated -ore -deserved -maid -blocking -phillip -jorge -shakes -dove -mann -variables -lacked -burden -accompanying -que -consistently -organizing -provisional -complained -endless -tubes -juice -georges -krishna -mick -thriller -laps -arcade -sage -snail -shannon -laurence -seoul -vacation -presenting -hire -churchill -surprisingly -prohibited -savannah -technically -##oli -##lessly -testimony -suited -speeds -toys -romans -flowering -measurement -talented -kay -settings -charleston -expectations -shattered -achieving -triumph -ceremonies -portsmouth -lanes -mandatory -loser -stretching -cologne -realizes -seventy -cornell -careers -webb -##ulating -americas -budapest -ava -suspicion -yo -conrad -sterling -jessie -rector -##az -1831 -transform -organize -loans -christine -volcanic -warrant -slender -summers -subfamily -newer -danced -dynamics -rhine -proceeds -heinrich -gastropod -commands -sings -facilitate -easter -positioned -responses -expense -fruits -yanked 
-imported -25th -velvet -vic -primitive -tribune -baldwin -neighbourhood -donna -rip -hay -##uro -1814 -espn -welcomed -##aria -qualifier -glare -highland -timing -##cted -shells -eased -geometry -louder -exciting -slovakia -##iz -savings -prairie -marching -rafael -tonnes -##lled -curtain -preceding -shy -heal -greene -worthy -##pot -detachment -bury -sherman -##eck -reinforced -seeks -bottles -contracted -duchess -outfit -walsh -mickey -geoffrey -archer -squeeze -dawson -eliminate -invention -##enberg -neal -##eth -stance -dealer -coral -maple -retire -simplified -1833 -hid -watts -backwards -jules -##oke -genesis -frames -rebounds -burma -woodland -moist -santos -whispers -drained -subspecies -streaming -ulster -burnt -correspondence -maternal -gerard -denis -stealing -genius -duchy -##oria -inaugurated -momentum -suits -placement -sovereign -clause -thames -##hara -confederation -reservation -sketch -yankees -lets -rotten -charm -hal -verses -commercially -dot -salon -citation -adopt -winnipeg -mist -allocated -cairo -jenkins -interference -objectives -##wind -1820 -portfolio -armoured -sectors -initiatives -integrity -exercises -robe -tap -gazed -##tones -distracted -rulers -favorable -jerome -tended -cart -factories -##eri -diplomat -valued -gravel -charitable -calvin -exploring -shepherd -terrace -pupil -##ural -reflects -##rch -governors -shelf -depths -##nberg -trailed -crest -tackle -##nian -hatred -##kai -clare -makers -ethiopia -longtime -detected -embedded -lacking -slapped -rely -thomson -anticipation -morton -successive -agnes -screenwriter -straightened -philippe -playwright -haunted -licence -iris -intentions -sutton -logical -correctly -##weight -branded -licked -tipped -silva -ricky -narrator -requests -##ents -greeted -supernatural -cow -##wald -lung -refusing -employer -strait -gaelic -liner -##piece -zoe -sabha -##mba -driveway -harvest -prints -bates -reluctantly -threshold -algebra -ira -wherever -coupled -assumption -picks -designers -raids 
-gentlemen -roller -blowing -leipzig -locks -screw -dressing -strand -##lings -scar -dwarf -depicts -##nu -nods -differ -boris -##eur -yuan -flip -##gie -mob -invested -questioning -applying -shout -##sel -gameplay -blamed -illustrations -bothered -weakness -rehabilitation -##zes -envelope -rumors -miners -leicester -subtle -kerry -ferguson -premiership -bengali -prof -catches -remnants -dana -##rily -shouting -presidents -baltic -ought -ghosts -dances -sailors -shirley -fancy -dominic -##bie -madonna -##rick -bark -buttons -gymnasium -ashes -liver -toby -oath -providence -doyle -evangelical -nixon -cement -carnegie -embarked -hatch -surroundings -guarantee -needing -pirate -essence -filter -crane -hammond -projected -immune -percy -twelfth -regent -doctoral -damon -mikhail -##ichi -critically -elect -realised -abortion -acute -screening -mythology -steadily -frown -nottingham -kirk -wa -minneapolis -##rra -module -algeria -nautical -encounters -surprising -statues -availability -shirts -pie -alma -brows -munster -mack -soup -crater -tornado -sanskrit -cedar -explosive -bordered -dixon -planets -stamp -exam -happily -##bble -carriers -kidnapped -accommodation -emigrated -##met -knockout -correspondent -violation -profits -peaks -lang -specimen -agenda -ancestry -pottery -spelling -equations -obtaining -ki -linking -1825 -debris -asylum -buddhism -##ants -gazette -dental -eligibility -fathers -averaged -zimbabwe -francesco -coloured -hissed -translator -lynch -mandate -humanities -mackenzie -uniforms -##iana -asset -fitting -samantha -genera -rim -beloved -shark -riot -entities -expressions -indo -carmen -slipping -owing -abbot -neighbor -sidney -rats -recommendations -encouraging -squadrons -anticipated -commanders -conquered -donations -diagnosed -divide -##iva -guessed -decoration -vernon -auditorium -revelation -conversations -##kers -##power -herzegovina -dash -alike -protested -lateral -herman -accredited -##gent -freeman -mel -fiji -crow -crimson -##rine 
-livestock -##pped -humanitarian -bored -oz -whip -##lene -##ali -legitimate -alter -grinning -spelled -anxious -oriental -wesley -##nin -##hole -carnival -controller -detect -##ssa -bowed -educator -kosovo -macedonia -##sin -occupy -mastering -stephanie -janeiro -para -unaware -nurses -noon -hopefully -ranger -combine -sociology -polar -rica -##eer -neill -##sman -holocaust -doubled -lust -1828 -decent -cooling -unveiled -1829 -nsw -homer -chapman -meyer -dive -mae -reagan -expertise -##gled -darwin -brooke -sided -prosecution -investigating -comprised -petroleum -genres -reluctant -differently -trilogy -johns -vegetables -corpse -highlighted -lounge -pension -unsuccessfully -elegant -aided -ivory -beatles -amelia -cain -dubai -immigrant -babe -underwater -combining -mumbled -atlas -horns -accessed -ballad -physicians -homeless -gestured -rpm -freak -louisville -corporations -patriots -prizes -rational -warn -modes -decorative -overnight -din -troubled -phantom -monarch -sheer -##dorf -generals -guidelines -organs -addresses -enhance -curling -parishes -cord -##kie -caesar -deutsche -bavaria -coleman -cyclone -##eria -bacon -petty -##yama -##old -hampton -diagnosis -1824 -throws -complexity -rita -disputed -pablo -marketed -trafficking -##ulus -examine -plague -formats -vault -faithful -##bourne -webster -highlights -##ient -phones -vacuum -sandwich -modeling -##gated -bolivia -clergy -qualities -isabel -##nas -##ars -wears -screams -reunited -annoyed -bra -##ancy -##rate -differential -transmitter -tattoo -container -poker -##och -excessive -resides -cowboys -##tum -augustus -trash -providers -statute -retreated -balcony -reversed -void -storey -preceded -masses -leap -laughs -neighborhoods -wards -schemes -falcon -santo -battlefield -ronnie -lesbian -venus -##dian -beg -sandstone -daylight -punched -gwen -analog -stroked -wwe -acceptable -measurements -toxic -##kel -adequate -surgical -economist -parameters -varsity -##sberg -quantity -##chy -##rton -countess 
-generating -precision -diamonds -expressway -##ı -1821 -uruguay -talents -galleries -expenses -scanned -colleague -outlets -ryder -lucien -##ila -paramount -syracuse -dim -fangs -gown -sweep -##sie -missionaries -websites -sentences -adviser -val -trademark -spells -##plane -patience -starter -slim -##borg -toe -incredibly -shoots -elliot -nobility -##wyn -cowboy -endorsed -gardner -tendency -persuaded -organisms -emissions -kazakhstan -amused -boring -chips -themed -##hand -constantinople -chasing -systematic -guatemala -borrowed -erin -carey -##hard -highlands -struggles -1810 -##ifying -##ced -exceptions -develops -enlarged -kindergarten -castro -##rina -leigh -zombie -juvenile -##most -consul -sailor -hyde -clarence -intensive -pinned -nasty -useless -jung -clayton -stuffed -exceptional -ix -apostolic -transactions -exempt -swinging -cove -religions -shields -dairy -bypass -pursuing -joyce -bombay -chassis -southampton -chat -interact -redesignated -##pen -nascar -pray -salmon -rigid -regained -malaysian -grim -publicity -constituted -capturing -toilet -delegate -purely -tray -drift -loosely -striker -weakened -trinidad -mitch -itv -defines -transmitted -scarlet -nodding -fitzgerald -narrowly -tooth -standings -virtue -##wara -##cting -chateau -gloves -lid -hurting -conservatory -##pel -sinclair -reopened -sympathy -nigerian -strode -advocated -optional -chronic -discharge -suck -compatible -laurel -stella -fails -wage -dodge -informal -sorts -levi -buddha -villagers -chronicles -heavier -summoned -gateway -eleventh -jewelry -translations -accordingly -seas -##ency -fiber -pyramid -cubic -dragging -##ista -caring -##ops -contacted -lunar -lisbon -patted -1826 -sacramento -theft -madagascar -subtropical -disputes -holidays -piper -willow -mare -cane -newfoundland -benny -companions -dong -raj -observe -roar -charming -plaque -tibetan -fossils -enacted -manning -bubble -tanzania -##eda -##hir -funk -swamp -deputies -cloak -ufc -scenario -par -scratch -metals 
-anthem -guru -engaging -specially -##boat -dialects -nineteen -cecil -duet -disability -unofficial -##lies -defunct -moonlight -drainage -surname -puzzle -switching -conservatives -mammals -knox -broadcaster -sidewalk -cope -##ried -benson -princes -peterson -##sal -bedford -sharks -eli -wreck -alberto -gasp -archaeology -lgbt -teaches -securities -madness -compromise -waving -coordination -davidson -visions -leased -possibilities -eighty -fernandez -enthusiasm -assassin -sponsorship -reviewer -kingdoms -estonian -laboratories -##fy -##nal -applies -verb -celebrations -##zzo -rowing -lightweight -sadness -submit -balanced -dude -explicitly -metric -magnificent -mound -brett -mohammad -mistakes -irregular -sanders -betrayed -shipped -surge -##enburg -reporters -termed -georg -pity -verbal -bulls -abbreviated -enabling -appealed -sicily -sting -heel -sweetheart -bart -spacecraft -brutal -monarchy -aberdeen -cameo -diane -survivor -clyde -##aries -complaint -##makers -clarinet -delicious -chilean -karnataka -coordinates -1818 -panties -##rst -pretending -dramatically -kiev -tends -distances -catalog -launching -instances -telecommunications -portable -lindsay -vatican -##eim -angles -aliens -marker -stint -screens -bolton -##rne -judy -wool -benedict -plasma -europa -imaging -filmmaker -swiftly -contributor -opted -stamps -apologize -financing -butter -gideon -sophisticated -alignment -avery -chemicals -yearly -speculation -prominence -professionally -immortal -institutional -inception -wrists -identifying -tribunal -derives -gains -papal -preference -linguistic -vince -operative -brewery -##ont -unemployment -boyd -##ured -##outs -albeit -prophet -1813 -##rad -quarterly -asteroid -cleaned -radius -temper -##llen -telugu -jerk -viscount -##ote -glimpse -##aya -yacht -hawaiian -baden -laptop -readily -##gu -monetary -offshore -scots -watches -##yang -##arian -upgrade -needle -lea -encyclopedia -flank -fingertips -delight -teachings -confirm -roth -beaches -midway 
-winters -##iah -teasing -daytime -beverly -gambling -##backs -regulated -clement -hermann -tricks -knot -##shing -##uring -##vre -detached -ecological -owed -specialty -byron -inventor -bats -stays -screened -unesco -midland -trim -affection -##ander -jess -thoroughly -feedback -chennai -strained -heartbeat -wrapping -overtime -pleaded -##sworth -leisure -oclc -##tate -##ele -feathers -angelo -thirds -nuts -surveys -clever -gill -commentator -##dos -darren -rides -gibraltar -dissolution -dedication -shin -meals -saddle -elvis -reds -chaired -taller -appreciation -functioning -niece -favored -advocacy -robbie -criminals -suffolk -yugoslav -passport -constable -congressman -hastings -##rov -consecrated -sparks -ecclesiastical -confined -##ovich -muller -floyd -nora -1822 -paved -1827 -cumberland -ned -saga -spiral -appreciated -collaborative -treating -similarities -feminine -finishes -##ib -jade -import -##hot -champagne -mice -securing -celebrities -helsinki -attributes -##gos -cousins -phases -ache -lucia -gandhi -submission -vicar -spear -shine -tasmania -biting -detention -constitute -tighter -seasonal -##gus -terrestrial -matthews -effectiveness -parody -philharmonic -##onic -1816 -strangers -encoded -consortium -guaranteed -regards -shifts -tortured -collision -supervisor -inform -broader -insight -theaters -armour -emeritus -blink -incorporates -mapping -handball -flexible -##nta -substantially -generous -thief -carr -loses -1793 -prose -ucla -romeo -generic -metallic -realization -damages -commissioners -zach -default -helicopters -lengthy -stems -partnered -spectators -rogue -indication -penalties -teresa -1801 -sen -##tric -dalton -##wich -irving -photographic -##vey -deaf -peters -excluded -unsure -##vable -patterson -crawled -##zio -resided -whipped -latvia -slower -ecole -pipes -employers -maharashtra -comparable -textile -pageant -##gel -alphabet -binary -irrigation -chartered -choked -antoine -offs -waking -supplement -quantities -demolition -regain 
-locate -urdu -folks -scary -andreas -whites -##ava -classrooms -mw -aesthetic -publishes -valleys -guides -cubs -johannes -bryant -conventions -affecting -##itt -drain -awesome -isolation -prosecutor -ambitious -apology -captive -downs -atmospheric -lorenzo -aisle -beef -foul -##onia -kidding -composite -disturbed -illusion -natives -##ffer -rockets -riverside -wartime -painters -adolf -melted -uncertainty -simulation -hawks -progressed -meantime -builder -spray -breach -unhappy -regina -russians -determining -tram -1806 -##quin -aging -1823 -garion -rented -mister -diaz -terminated -clip -1817 -depend -nervously -disco -owe -defenders -shiva -notorious -disbelief -shiny -worcester -##gation -##yr -trailing -undertook -islander -belarus -limitations -watershed -fuller -overlooking -utilized -raphael -1819 -synthetic -breakdown -klein -##nate -moaned -memoir -lamb -practicing -##erly -cellular -arrows -exotic -witches -charted -rey -hut -hierarchy -subdivision -freshwater -giuseppe -aloud -reyes -qatar -marty -sideways -utterly -sexually -jude -prayers -mccarthy -softball -blend -damien -##gging -##metric -wholly -erupted -lebanese -negro -revenues -tasted -comparative -teamed -transaction -labeled -maori -sovereignty -parkway -trauma -gran -malay -advancement -descendant -buzz -salvation -inventory -symbolic -##making -antarctica -mps -##bro -mohammed -myanmar -holt -submarines -tones -##lman -locker -patriarch -bangkok -emerson -remarks -predators -kin -afghan -confession -norwich -rental -emerge -advantages -##zel -rca -##hold -shortened -storms -aidan -##matic -autonomy -compliance -##quet -dudley -##osis -1803 -motto -documentation -summary -professors -spectacular -christina -archdiocese -flashing -innocence -remake -##dell -psychic -reef -scare -employ -sticks -meg -gus -leans -accompany -bergen -tomas -doom -wages -pools -##bes -breasts -scholarly -alison -outline -brittany -breakthrough -willis -realistic -##cut -##boro -competitor -##stan -pike -picnic 
-designing -commercials -washing -villain -skiing -costumes -auburn -halted -executives -logistics -cycles -vowel -applicable -barrett -exclaimed -eurovision -eternity -ramon -##umi -modifications -sweeping -disgust -torch -aviv -ensuring -rude -dusty -sonic -donovan -outskirts -cu -pathway -##band -##gun -disciplines -acids -cadet -paired -sketches -##sive -marriages -folding -peers -slovak -implies -admired -##beck -1880s -leopold -instinct -attained -weston -megan -horace -##ination -dorsal -ingredients -evolutionary -complications -deity -lethal -brushing -levy -deserted -institutes -posthumously -delivering -telescope -coronation -motivated -rapids -luc -flicked -pays -volcano -tanner -weighed -##nica -crowds -frankie -gifted -addressing -granddaughter -winding -##rna -constantine -gomez -##front -landscapes -rudolf -anthropology -slate -werewolf -astronomy -circa -rouge -dreaming -sack -knelt -drowned -naomi -prolific -tracked -freezing -herb -agony -randall -twisting -wendy -deposit -touches -vein -wheeler -##bbled -batted -retaining -tire -presently -compare -specification -daemon -nigel -##grave -merry -recommendation -czechoslovakia -sandra -roma -##sts -lambert -inheritance -sheikh -winchester -cries -examining -##yle -comeback -cuisine -nave -##iv -retrieve -tomatoes -barker -polished -defining -irene -lantern -personalities -begging -tract -swore -1809 -##gic -omaha -brotherhood -haiti -##ots -exeter -##ete -##zia -steele -dumb -pearson -surveyed -elisabeth -trends -fritz -bugs -fraction -calmly -viking -##birds -tug -inserted -unusually -##ield -confronted -distress -crashing -brent -turks -resign -##olo -cambodia -gabe -sauce -##kal -evelyn -extant -clusters -quarry -teenagers -luna -##lers -##ister -affiliation -drill -##ashi -panthers -scenic -libya -anita -strengthen -inscriptions -##cated -lace -sued -judith -riots -##uted -mint -##eta -preparations -midst -dub -challenger -##vich -mock -displaced -wicket -breaths -enables -schmidt -analyst 
-##lum -highlight -automotive -axe -josef -newark -sufficiently -resembles -50th -##pal -flushed -mum -traits -##ante -commodore -incomplete -warming -titular -ceremonial -ethical -celebrating -eighteenth -cao -lima -medalist -mobility -strips -snakes -miniature -zagreb -barton -escapes -umbrella -automated -doubted -differs -cooled -georgetown -dresden -cooked -fade -wyatt -jacobs -carlton -abundant -stereo -madras -inning -spur -malayalam -begged -osaka -groan -escaping -charging -dose -##aj -bud -papa -communists -advocates -edged -tri -resemble -peaking -necklace -fried -montenegro -saxony -goose -glances -stuttgart -curator -recruit -grocery -sympathetic -##tting -##fort -lotus -randolph -ancestor -##rand -succeeding -jupiter -1798 -macedonian -##heads -hiking -1808 -handing -fischer -##itive -garbage -##pies -prone -singular -papua -inclined -attractions -italia -pouring -motioned -grandma -garnered -jacksonville -corp -ego -ringing -aluminum -##hausen -ordering -##foot -drawer -traders -synagogue -##kawa -resistant -wandering -fragile -fiona -teased -hardcore -soaked -jubilee -decisive -exposition -mercer -poster -valencia -hale -kuwait -1811 -##ises -##wr -##eed -tavern -gamma -johan -##uer -airways -amino -gil -vocational -domains -torres -generator -folklore -outcomes -##keeper -canberra -shooter -fl -beams -confrontation -##gram -aligned -forestry -pipeline -jax -motorway -conception -decay -coffin -##cott -stalin -1805 -escorted -minded -##nam -sitcom -purchasing -twilight -veronica -additions -passive -tensions -straw -frequencies -1804 -refugee -cultivation -##iate -christie -clary -bulletin -crept -disposal -##rich -##zong -processor -crescent -##rol -emphasized -whale -nazis -aurora -dwelling -hauled -sponsors -toledo -ideology -theatres -tessa -cerambycidae -saves -turtle -cone -suspects -kara -rusty -yelling -greeks -mozart -shades -cocked -participant -shire -spit -freeze -necessity -##cos -inmates -nielsen -councillors -loaned -uncommon -omar 
-peasants -botanical -offspring -daniels -formations -jokes -1794 -pioneers -sigma -licensing -##sus -wheelchair -polite -1807 -liquor -pratt -trustee -##uta -forewings -balloon -kilometre -camping -explicit -casually -shawn -foolish -teammates -nm -hassan -carrie -judged -satisfy -vanessa -knives -selective -flowed -##lice -stressed -eliza -mathematician -cease -cultivated -##roy -commissions -browns -##ania -destroyers -sheridan -meadow -##rius -minerals -##cial -downstream -clash -gram -memoirs -ventures -baha -seymour -archie -midlands -edith -fare -flynn -invite -canceled -tiles -stabbed -boulder -incorporate -amended -camden -facial -mollusk -unreleased -descriptions -grabs -raises -ramp -shiver -##rose -coined -pioneering -tunes -qing -warwick -tops -melanie -giles -##rous -wandered -##inal -annexed -30th -unnamed -##ished -organizational -airplane -normandy -stoke -whistle -blessing -violations -chased -holders -shotgun -##ctic -reactor -##vik -tires -tearing -shores -fortified -mascot -constituencies -columnist -productive -tibet -##rta -lineage -hooked -tapes -judging -cody -##gger -hansen -kashmir -triggered -##eva -solved -cliffs -##tree -resisted -anatomy -protesters -transparent -implied -##iga -injection -mattress -excluding -##mbo -defenses -helpless -devotion -##elli -growl -liberals -weber -phenomena -atoms -plug -##iff -mortality -apprentice -howe -convincing -swimmer -barber -leone -promptly -sodium -def -nowadays -arise -##oning -gloucester -corrected -dignity -norm -erie -##ders -elders -evacuated -compression -##yar -hartford -backpack -reasoning -accepts -24th -wipe -millimetres -marcel -##oda -dodgers -albion -1790 -overwhelmed -aerospace -oaks -1795 -showcase -acknowledge -recovering -nolan -ashe -hurts -geology -fashioned -disappearance -farewell -swollen -shrug -marquis -wimbledon -rue -1792 -commemorate -reduces -experiencing -inevitable -calcutta -##court -murderer -sticking -fisheries -imagery -bloom -##inus -gustav -hesitation 
-memorable -viral -beans -accidents -tunisia -antenna -spilled -consort -treatments -aye -perimeter -##gard -donation -hostage -migrated -banker -addiction -apex -lil -trout -##ously -conscience -##nova -rams -sands -genome -passionate -troubles -##lets -amid -##ibility -##ret -higgins -exceed -vikings -##vie -payne -##zan -muscular -defendant -sucking -##wal -ibrahim -fuselage -claudia -vfl -europeans -snails -interval -##garh -preparatory -statewide -tasked -lacrosse -viktor -##lation -angola -##hra -flint -implications -employs -teens -patrons -stall -weekends -barriers -scrambled -nucleus -tehran -jenna -parsons -lifelong -robots -displacement -##bles -precipitation -knuckles -clutched -1802 -marrying -ecology -marx -accusations -declare -scars -kolkata -mat -meadows -bermuda -skeleton -finalists -vintage -crawl -coordinate -affects -subjected -orchestral -mistaken -mirrors -dipped -relied -arches -candle -##nick -incorporating -wildly -fond -basilica -owl -fringe -rituals -whispering -stirred -feud -tertiary -slick -goat -honorable -whereby -ricardo -stripes -parachute -adjoining -submerged -synthesizer -##gren -intend -positively -ninety -phi -beaver -partition -fellows -alexis -prohibition -carlisle -bizarre -fraternity -doubts -icy -aquatic -sneak -sonny -combines -airports -crude -supervised -spatial -merge -alfonso -##bic -corrupt -scan -undergo -##ams -disabilities -colombian -comparing -dolphins -perkins -reprinted -unanimous -bounced -hairs -underworld -midwest -semester -bucket -paperback -miniseries -coventry -demise -##leigh -demonstrations -sensor -rotating -yan -##hler -arrange -soils -##idge -hyderabad -labs -brakes -grandchildren -##nde -negotiated -rover -ferrari -continuation -directorate -augusta -stevenson -counterpart -gore -##rda -nursery -rican -ave -collectively -broadly -pastoral -repertoire -asserted -discovering -nordic -styled -fiba -cunningham -harley -middlesex -survives -tumor -tempo -zack -aiming -lok -urgent -##nto -devils 
-contractor -turin -##wl -bliss -repaired -simmons -moan -astronomical -negotiate -lyric -1890s -lara -bred -clad -angus -pbs -engineered -posed -hernandez -possessions -elbows -psychiatric -strokes -confluence -electorate -lifts -campuses -lava -alps -##ution -##date -physicist -woody -##ographic -##itis -juliet -reformation -sparhawk -complement -suppressed -jewel -##½ -floated -##kas -continuity -sadly -##ische -inability -melting -scanning -paula -flour -judaism -safer -vague -solving -curb -##stown -financially -gable -bees -expired -miserable -cassidy -dominion -1789 -cupped -robbery -facto -amos -warden -resume -tallest -marvin -pounded -declaring -gasoline -##aux -darkened -sophomore -##mere -erection -gossip -televised -risen -dial -##eu -pillars -passages -profound -arabian -ashton -silicon -nail -##lated -##hardt -fleming -firearms -ducked -circuits -blows -waterloo -titans -fireplace -cheshire -financed -activation -algorithms -constituent -catcher -cherokee -partnerships -sexuality -platoon -tragic -vivian -guarded -whiskey -meditation -poetic -##nga -porto -listeners -dominance -kendra -mona -chandler -factions -22nd -salisbury -attitudes -derivative -##ido -##haus -intake -paced -javier -illustrator -barrels -bias -cockpit -burnett -dreamed -ensuing -receptors -someday -hawkins -mattered -##lal -slavic -1799 -jesuit -cameroon -wasted -wax -lowering -victorious -freaking -outright -hancock -librarian -sensing -bald -calcium -myers -tablet -announcing -barack -shipyard -pharmaceutical -greenwich -flush -medley -patches -wolfgang -speeches -acquiring -exams -nikolai -hayden -kannada -reilly -waitress -abdomen -devastated -capped -pseudonym -pharmacy -fulfill -paraguay -1796 -clicked -##trom -archipelago -syndicated -##hman -lumber -orgasm -rejection -clifford -lorraine -advent -mafia -rodney -brock -##used -##elia -cassette -chamberlain -despair -mongolia -sensors -developmental -upstream -##alis -spanning -trombone -basque -seeded -interred -renewable 
-rhys -leapt -revision -molecule -##ages -chord -vicious -nord -shivered -23rd -arlington -debts -corpus -sunrise -bays -blackburn -centimetres -##uded -shuddered -strangely -gripping -cartoons -isabelle -orbital -##ppa -seals -proving -refusal -strengthened -bust -assisting -baghdad -batsman -portrayal -mara -pushes -spears -og -##cock -reside -nathaniel -brennan -1776 -confirmation -caucus -##worthy -markings -yemen -nobles -ku -lazy -viewer -catalan -encompasses -sawyer -##fall -sparked -substances -patents -braves -arranger -evacuation -sergio -persuade -dover -tolerance -penguin -cum -jockey -insufficient -townships -occupying -declining -plural -processed -projection -puppet -flanders -introduces -liability -##yon -gymnastics -antwerp -hobart -candles -jeep -wes -observers -chaplain -bundle -glorious -##hine -hazel -flung -sol -excavations -dumped -stares -bangalore -triangular -icelandic -intervals -expressing -turbine -##vers -songwriting -crafts -##igo -jasmine -ditch -rite -entertaining -comply -sorrow -wrestlers -basel -emirates -marian -rivera -helpful -##some -caution -downward -networking -##atory -##tered -darted -genocide -emergence -replies -specializing -spokesman -convenient -unlocked -fading -augustine -concentrations -resemblance -elijah -investigator -andhra -##uda -promotes -##rrell -fleeing -simone -announcer -lydia -weaver -residency -modification -##fest -stretches -alternatively -nat -lowe -lacks -##ented -pam -tile -concealed -inferior -abdullah -residences -tissues -vengeance -##ided -moisture -peculiar -groove -bologna -jennings -ninja -oversaw -zombies -pumping -batch -livingston -emerald -installations -1797 -peel -nitrogen -rama -##fying -schooling -strands -responding -werner -lime -casa -accurately -targeting -##rod -underway -##uru -hemisphere -lester -##yard -occupies -griffith -angrily -reorganized -##owing -courtney -deposited -estadio -##ifies -dunn -exiled -##ying -checks -##combe -successes -unexpectedly -blu -assessed 
-##flower -observing -sacked -spiders -kn -nodes -prosperity -audrey -divisional -broncos -tangled -adjust -feeds -erosion -paolo -surf -directory -snatched -humid -admiralty -screwed -reddish -##nese -modules -trench -lamps -bind -leah -bucks -competes -##nz -transcription -isles -violently -clutching -pga -cyclist -inflation -flats -ragged -unnecessary -##hian -stubborn -coordinated -harriet -baba -disqualified -insect -wolfe -##fies -reinforcements -rocked -duel -winked -embraced -bricks -##raj -hiatus -defeats -pending -brightly -jealousy -##xton -##uki -lena -colorful -##dley -stein -kidney -##shu -underwear -wanderers -##haw -##icus -guardians -m³ -roared -habits -##wise -permits -uranium -punished -disguise -bundesliga -elise -dundee -erotic -partisan -collectors -float -individually -rendering -behavioral -bucharest -ser -hare -valerie -corporal -nutrition -proportional -immense -##kis -pavement -##zie -##eld -sutherland -crouched -1775 -suzuki -trades -endurance -operas -crosby -prayed -priory -rory -socially -gujarat -walton -cube -pasha -privilege -lennon -floods -thorne -waterfall -nipple -scouting -approve -##lov -minorities -voter -dwight -extensions -assure -ballroom -slap -dripping -privileges -rejoined -confessed -demonstrating -patriotic -yell -investor -##uth -pagan -slumped -squares -confront -bert -embarrassment -aston -urging -sweater -starr -yuri -brains -williamson -commuter -mortar -structured -selfish -exports -##jon -cds -##him -unfinished -##rre -mortgage -destinations -##nagar -canoe -solitary -buchanan -delays -magistrate -fk -##pling -motivation -##lier -##vier -recruiting -assess -##mouth -malik -antique -1791 -pius -rahman -reich -tub -zhou -smashed -airs -galway -xii -conditioning -honduras -discharged -dexter -##pf -lionel -debates -lemon -volunteered -dioxide -procession -devi -sic -tremendous -advertisements -colts -transferring -verdict -hanover -decommissioned -utter -relate -pac -racism -beacon -limp -similarity -terra 
-occurrence -ant -becky -capt -updates -armament -richie -pal -##graph -halloween -mayo -##ssen -##bone -cara -serena -fcc -dolls -obligations -##dling -violated -lafayette -jakarta -exploitation -infamous -iconic -##lah -##park -moody -reginald -dread -spill -crystals -olivier -modeled -bluff -equilibrium -separating -notices -ordnance -extinction -onset -cosmic -attachment -sammy -expose -privy -anchored -##bil -abbott -admits -bending -baritone -emmanuel -policeman -vaughan -winged -climax -dresses -denny -polytechnic -mohamed -burmese -authentic -nikki -genetics -grandparents -homestead -gaza -postponed -metacritic -una -##sby -unstable -dissertation -##cian -curls -obscure -uncovered -bronx -praying -disappearing -##hoe -prehistoric -coke -turret -mutations -nonprofit -pits -monaco -##usion -prominently -dispatched -podium -##mir -uci -##uation -fortifications -birthplace -kendall -##lby -##oll -preacher -rack -goodman -persistent -##ott -countless -jaime -recorder -lexington -persecution -jumps -renewal -wagons -crushing -##holder -decorations -##lake -abundance -wrath -laundry -£1 -garde -jeanne -beetles -peasant -splitting -caste -sergei -##rer -##ema -scripts -##ively -rub -satellites -##vor -inscribed -verlag -scrapped -gale -packages -chick -potato -slogan -kathleen -arabs -##culture -counterparts -reminiscent -choral -##tead -rand -retains -bushes -dane -accomplish -courtesy -closes -##oth -slaughter -hague -krakow -lawson -tailed -elias -ginger -##ttes -canopy -betrayal -rebuilding -turf -##hof -frowning -allegiance -brigades -kicks -rebuild -polls -alias -nationalism -rowan -audition -bowie -fortunately -recognizes -harp -dillon -horrified -##oro -renault -ropes -presumed -rewarded -infrared -wiping -accelerated -illustration -presses -practitioners -badminton -##iard -detained -##tera -recognizing -relates -misery -##sies -##tly -reproduction -piercing -potatoes -thornton -esther -manners -hbo -##aan -ours -bullshit -ernie -perennial -sensitivity 
-illuminated -rupert -##iss -rfc -nassau -##dock -staggered -socialism -##haven -appointments -nonsense -prestige -sharma -haul -solidarity -##rata -igor -pedestrian -##uit -baxter -tenants -wires -medication -unlimited -guiding -impacts -diabetes -##rama -sasha -pas -clive -extraction -continually -constraints -##bilities -sonata -hunted -sixteenth -chu -planting -quote -mayer -pretended -spat -ceramic -##cci -curtains -pigs -pitching -##dad -latvian -sore -dayton -##sted -patrols -slice -playground -##nted -shone -stool -apparatus -inadequate -mates -treason -##ija -desires -##liga -##croft -somalia -laurent -mir -grape -obliged -chevrolet -thirteenth -stunning -enthusiastic -##ede -accounted -concludes -currents -basil -##kovic -drought -##rica -mai -##aire -shove -posting -##shed -pilgrimage -humorous -packing -fry -pencil -wines -smells -marilyn -aching -newest -clung -bon -neighbours -sanctioned -##pie -mug -##stock -drowning -hydraulic -##vil -hiring -reminder -lilly -investigators -##ncies -sour -##eous -compulsory -packet -##rion -##graphic -##elle -cannes -##inate -depressed -##rit -heroic -importantly -theresa -##tled -conway -saturn -marginal -rae -##xia -corresponds -royce -pact -jasper -explosives -packaging -aluminium -##ttered -denotes -rhythmic -spans -assignments -hereditary -outlined -originating -sundays -lad -reissued -greeting -beatrice -##dic -pillar -marcos -plots -handbook -alcoholic -judiciary -avant -slides -extract -masculine -blur -##eum -homage -trembled -owens -hymn -trey -signaling -socks -accumulated -reacted -attic -theo -lining -angie -distraction -primera -talbot -creativity -billed -##hey -deacon -eduardo -identifies -proposition -dizzy -gunner -hogan -##yam -##pping -##hol -ja -##chan -jensen -reconstructed -##berger -clearance -darius -##nier -abe -harlem -plea -dei -circled -emotionally -notation -fascist -neville -exceeded -upwards -viable -ducks -workforce -racer -limiting -shri -##lson -possesses -kerr -moths -devastating 
-laden -disturbing -locking -gal -fearing -accreditation -flavor -aide -1870s -mountainous -##baum -melt -##ures -texture -servers -soda -herd -##nium -erect -puzzled -hum -peggy -examinations -gould -testified -geoff -ren -devised -sacks -##law -denial -posters -grunted -cesar -tutor -gerry -offerings -byrne -falcons -combinations -incoming -pardon -rocking -26th -avengers -flared -mankind -seller -uttar -loch -nadia -stroking -exposing -fertile -ancestral -instituted -##has -noises -prophecy -taxation -eminent -vivid -pol -##bol -dart -indirect -multimedia -notebook -upside -displaying -adrenaline -referenced -geometric -##iving -progression -##ddy -blunt -announce -##far -implementing -##lav -aggression -liaison -cooler -cares -headache -plantations -gorge -dots -impulse -thickness -ashamed -averaging -kathy -obligation -precursor -fowler -symmetry -thee -hears -##rai -undergoing -butcher -bowler -##lip -cigarettes -subscription -goodness -##ically -browne -##hos -kyoto -donor -##erty -damaging -friction -drifting -expeditions -hardened -prostitution -fauna -blankets -claw -tossing -snarled -butterflies -recruits -investigative -coated -healed -communal -hai -xiii -academics -boone -psychologist -restless -lahore -stephens -brendan -foreigners -printer -ached -explode -27th -deed -scratched -dared -##pole -cardiac -1780 -okinawa -proto -commando -compelled -oddly -electrons -replica -thanksgiving -##rist -sheila -deliberate -stafford -tidal -representations -hercules -ou -##path -##iated -kidnapping -lenses -##tling -deficit -samoa -mouths -consuming -computational -maze -granting -smirk -razor -fixture -ideals -inviting -aiden -nominal -issuing -julio -pitt -ramsey -docks -##oss -exhaust -##owed -bavarian -draped -anterior -mating -ethiopian -explores -noticing -##nton -discarded -convenience -hoffman -endowment -beasts -cartridge -mormon -paternal -probe -sleeves -interfere -lump -deadline -jenks -bulldogs -scrap -alternating -justified -reproductive -nam 
-seize -descending -secretariat -kirby -grouped -smash -panther -sedan -tapping -lola -cheer -germanic -unfortunate -##eter -unrelated -##fan -subordinate -##sdale -suzanne -advertisement -##ility -horsepower -##lda -cautiously -discourse -luigi -##mans -##fields -noun -prevalent -mao -schneider -everett -surround -governorate -kira -##avia -westward -##take -misty -rails -sustainability -unused -##rating -packs -toast -unwilling -regulate -thy -suffrage -nile -awe -assam -definitions -travelers -affordable -##rb -conferred -sells -undefeated -beneficial -torso -basal -repeating -remixes -bahrain -cables -fang -##itated -excavated -numbering -statutory -deluxe -##lian -forested -ramirez -derbyshire -zeus -slamming -transfers -astronomer -banana -lottery -berg -histories -bamboo -##uchi -resurrection -posterior -bowls -vaguely -##thi -thou -preserving -tensed -offence -##inas -meyrick -callum -ridden -watt -langdon -tying -lowland -snorted -daring -truman -##hale -##girl -aura -overly -filing -weighing -goa -infections -philanthropist -saunders -eponymous -##owski -latitude -perspectives -reviewing -mets -commandant -radial -##kha -flashlight -reliability -koch -vowels -amazed -ada -elaine -supper -##encies -predator -debated -soviets -cola -##boards -##nah -compartment -crooked -arbitrary -fourteenth -havana -majors -steelers -clips -profitable -ambush -exited -packers -##tile -nude -cracks -fungi -limb -trousers -josie -shelby -tens -frederic -##ος -definite -smoothly -constellation -insult -baton -discs -lingering -##nco -conclusions -lent -staging -becker -grandpa -shaky -##tron -einstein -obstacles -adverse -economically -##moto -mccartney -thor -dismissal -motions -readings -nostrils -treatise -##pace -squeezing -evidently -prolonged -1783 -venezuelan -je -marguerite -beirut -takeover -shareholders -##vent -denise -digit -airplay -norse -##bbling -imaginary -pills -hubert -blaze -vacated -eliminating -vine -mansfield -retrospective -barrow -borne -clutch -bail 
-forensic -weaving -##nett -##witz -desktop -citadel -promotions -worrying -dorset -subdivided -##iating -manned -expeditionary -pickup -synod -chuckle -barney -##rz -##ffin -functionality -karachi -litigation -meanings -lick -anders -##ffed -execute -curl -oppose -ankles -typhoon -##ache -linguistics -compassion -pressures -grazing -perfection -##iting -immunity -monopoly -muddy -backgrounds -namibia -francesca -monitors -attracting -stunt -tuition -##ии -vegetable -##mates -##quent -mgm -jen -complexes -forts -cellar -bites -seventeenth -royals -flemish -failures -mast -charities -##cular -peruvian -capitals -macmillan -ipswich -outward -frigate -postgraduate -folds -employing -##ouse -concurrently -fiery -##tai -contingent -nightmares -monumental -nicaragua -##kowski -lizard -mal -fielding -gig -reject -harding -##ipe -coastline -##cin -beethoven -humphrey -innovations -##tam -norris -doris -solicitor -obey -niagara -shelves -bourbon -nightclub -specifications -hilton -##ndo -centennial -dispersed -worm -neglected -briggs -kuala -uneasy -##nstein -##bound -##aking -##burgh -awaiting -pronunciation -##bbed -##quest -eh -optimal -zhu -raped -greens -presided -brenda -worries -venetian -marxist -turnout -##lius -refined -braced -sins -grasped -sunderland -nickel -speculated -lowell -cyrillic -communism -fundraising -resembling -colonists -mutant -freddie -usc -##mos -gratitude -##run -mural -##lous -chemist -reminds -28th -steals -tess -pietro -##ingen -promoter -ri -microphone -honoured -rai -sant -##qui -feather -##nson -burlington -kurdish -terrorists -deborah -sickness -##wed -hazard -irritated -desperation -veil -clarity -##rik -jewels -xv -##gged -##ows -##cup -berkshire -unfair -mysteries -orchid -winced -exhaustion -renovations -stranded -obe -infinity -##nies -adapt -redevelopment -thanked -registry -olga -domingo -noir -tudor -ole -commenting -behaviors -##ais -crisp -pauline -probable -stirling -wigan -paralympics -panting -surpassed -##rew -luca -barred 
-famed -##sters -cassandra -waiter -carolyn -exported -##orted -andres -destructive -deeds -jonah -castles -vacancy -##glass -1788 -orchard -yep -famine -belarusian -sprang -##forth -skinny -##mis -administrators -rotterdam -zambia -zhao -boiler -discoveries -##ride -##physics -lucius -disappointing -outreach -spoon -##frame -qualifications -unanimously -enjoys -regency -##iidae -stade -realism -veterinary -rodgers -dump -alain -chestnut -castile -censorship -rumble -gibbs -communion -reggae -inactivated -logs -loads -##houses -homosexual -##iano -ale -informs -##cas -phrases -plaster -linebacker -ambrose -kaiser -fascinated -limerick -recruitment -forge -mastered -##nding -leinster -rooted -threaten -##strom -borneo -##hes -suggestions -scholarships -propeller -documentaries -patronage -coats -constructing -invest -neurons -comet -entirety -shouts -identities -annoying -unchanged -wary -##antly -##ogy -neat -oversight -##kos -phillies -replay -constance -##kka -incarnation -humble -skies -minus -##acy -smithsonian -guerrilla -jar -cadets -##plate -surplus -audit -##aru -cracking -joanna -louisa -pacing -##lights -intentionally -##iri -diner -nwa -imprint -australians -tong -unprecedented -bunker -naive -specialists -ark -nichols -railing -leaked -pedal -##uka -shrub -longing -roofs -captains -neural -tuned -##ntal -##jet -emission -medina -frantic -codex -definitive -sid -abolition -intensified -stocks -enrique -sustain -genoa -oxide -##written -clues -cha -##gers -tributaries -fragment -venom -##ente -##sca -muffled -vain -sire -laos -##ingly -##hana -hastily -snapping -surfaced -sentiment -motive -##oft -contests -approximate -mesa -luckily -dinosaur -exchanges -propelled -accord -bourne -relieve -tow -masks -offended -##ues -cynthia -##mmer -rains -bartender -zinc -reviewers -lois -##sai -legged -arrogant -rafe -comprise -handicap -blockade -inlet -lagoon -copied -drilling -shelley -petals -##inian -mandarin -obsolete -##inated -onward -arguably -productivity 
-praising -seldom -busch -discusses -raleigh -shortage -ranged -stanton -encouragement -firstly -conceded -overs -temporal -##uke -cbe -##bos -woo -certainty -pumps -##pton -stalked -##uli -lizzie -periodic -thieves -weaker -gases -shoving -chooses -wc -##chemical -prompting -weights -##kill -robust -flanked -sticky -tuberculosis -##eb -##eal -christchurch -resembled -wallet -reese -inappropriate -pictured -distract -fixing -fiddle -giggled -burger -heirs -hairy -mechanic -torque -obsessed -chiefly -cheng -logging -extracted -meaningful -numb -##vsky -gloucestershire -reminding -unite -##lit -breeds -diminished -clown -glove -1860s -archibald -focal -freelance -sliced -depiction -##yk -organism -switches -sights -stray -crawling -##ril -lever -leningrad -interpretations -loops -anytime -reel -alicia -delighted -##ech -inhaled -xiv -suitcase -bernie -vega -licenses -northampton -exclusion -induction -monasteries -racecourse -homosexuality -##sfield -##rky -dimitri -michele -alternatives -ions -commentators -genuinely -objected -pork -hospitality -fencing -stephan -warships -peripheral -wit -drunken -wrinkled -quentin -spends -departing -chung -numerical -spokesperson -johannesburg -caliber -killers -##udge -assumes -neatly -demographic -abigail -bloc -mounting -##lain -bentley -slightest -xu -recipients -##jk -merlin -##writer -seniors -prisons -blinking -hindwings -flickered -kappa -##hel -80s -strengthening -appealing -brewing -gypsy -mali -lashes -hulk -unpleasant -harassment -bio -treaties -predict -instrumentation -pulp -troupe -boiling -mantle -##ffe -##vn -dividing -handles -verbs -##onal -coconut -senegal -thorough -gum -momentarily -##sto -cocaine -panicked -destined -##turing -teatro -denying -weary -captained -mans -##hawks -wakefield -bollywood -thankfully -cyril -amendments -##bahn -consultation -stud -reflections -kindness -1787 -internally -##ovo -tex -mosaic -distribute -paddy -seeming -##hic -piers -##mura -popularly -winger -kang -sentinel -mccoy 
-##anza -covenant -##bag -verge -fireworks -suppress -thrilled -dominate -##jar -swansea -reconciliation -stiffened -cue -dorian -##uf -damascus -amor -ida -foremost -##aga -porsche -unseen -dir -##had -##azi -stony -lexi -melodies -##nko -angular -integer -podcast -ants -inherent -jaws -justify -persona -##olved -josephine -##nr -##ressed -customary -flashes -gala -cyrus -glaring -backyard -ariel -physiology -greenland -stir -avon -atletico -finch -methodology -ked -mas -catholicism -townsend -branding -quincy -fits -containers -1777 -ashore -aragon -forearm -poisoning -adopting -conquer -grinding -amnesty -keller -finances -evaluate -forged -lankan -instincts -##uto -guam -bosnian -photographed -workplace -desirable -protector -allocation -intently -encourages -willy -##sten -bodyguard -electro -brighter -bihar -##chev -lasts -opener -amphibious -sal -verde -arte -##cope -captivity -vocabulary -yields -##tted -agreeing -desmond -pioneered -##chus -strap -campaigned -railroads -##ович -emblem -##dre -stormed -##ulous -marijuana -northumberland -##nath -bowen -landmarks -beaumont -##qua -danube -##bler -attorneys -th -flyers -critique -villains -cass -mutation -acc -##0s -colombo -mckay -motif -sampling -concluding -syndicate -##rell -neon -stables -warnings -clint -mourning -wilkinson -##tated -merrill -leopard -evenings -exhaled -emil -sonia -ezra -discrete -stove -farrell -fifteenth -prescribed -superhero -##rier -worms -helm -wren -##duction -expo -##rator -hq -unfamiliar -antony -prevents -acceleration -fiercely -mari -painfully -calculations -cheaper -ign -clifton -irvine -davenport -mozambique -pierced -##evich -wonders -##wig -##cate -##iling -crusade -ware -enzymes -reasonably -mls -##coe -mater -ambition -bunny -eliot -kernel -##fin -asphalt -headmaster -torah -aden -lush -pins -waived -##yas -joao -substrate -enforce -##grad -##ules -alvarez -selections -epidemic -tempted -bremen -translates -ensured -waterfront -29th -forrest -manny -malone -kramer 
-reigning -simpler -absorption -engraved -##ffy -evaluated -1778 -haze -comforting -crossover -##abe -thorn -##rift -##imo -suppression -fatigue -cutter -wurttemberg -##orf -enforced -hovering -proprietary -samurai -syllable -ascent -lacey -tick -lars -tractor -merchandise -rep -bouncing -defendants -##yre -huntington -##oko -standardized -##hor -##hima -assassinated -predecessors -rainy -liar -assurance -lyrical -##uga -secondly -flattened -parameter -undercover -##mity -bordeaux -punish -ridges -markers -exodus -inactive -hesitate -debbie -nyc -pledge -savoy -nagar -offset -organist -##tium -hesse -marin -converting -##iver -diagram -propulsion -validity -reverted -supportive -ministries -clans -responds -proclamation -##inae -ein -pleading -patriot -birch -islanders -strauss -hates -##dh -brandenburg -concession -1900s -killings -textbook -antiquity -cinematography -wharf -embarrassing -setup -creed -farmland -inequality -centred -signatures -fallon -##ingham -##uts -ceylon -gazing -directive -laurie -##tern -globally -##uated -##dent -allah -excavation -threads -##cross -frantically -icc -utilize -determines -respiratory -thoughtful -receptions -##dicate -merging -chandra -seine -builders -builds -diagnostic -dev -visibility -goddamn -analyses -dhaka -proves -chancel -concurrent -curiously -canadians -pumped -restoring -1850s -turtles -jaguar -sinister -spinal -declan -vows -1784 -glowed -capitalism -swirling -universidad -##lder -##oat -soloist -##genic -##oor -coincidence -beginnings -nissan -dip -resorts -caucasus -combustion -infectious -##eno -pigeon -serpent -##itating -conclude -masked -salad -jew -##gr -surreal -toni -##wc -harmonica -##gins -##etic -##coat -fishermen -intending -bravery -##wave -klaus -titan -wembley -taiwanese -ransom -40th -incorrect -hussein -eyelids -cooke -dramas -utilities -##etta -##print -eisenhower -principally -granada -lana -##rak -openings -concord -##bl -bethany -connie -morality -sega -##mons -##nard -earnings -##kara 
-##cine -communes -##rel -coma -composing -softened -severed -grapes -nguyen -analyzed -warlord -hubbard -heavenly -behave -slovenian -##hit -##ony -hailed -filmmakers -trance -caldwell -skye -unrest -coward -likelihood -##aging -bern -taliban -honolulu -propose -browser -imagining -cobra -contributes -dukes -instinctively -conan -violinist -##ores -accessories -gradual -##amp -quotes -sioux -##dating -undertake -intercepted -sparkling -compressed -fungus -tombs -haley -imposing -rests -degradation -lincolnshire -retailers -wetlands -tulsa -distributor -dungeon -nun -greenhouse -convey -atlantis -aft -exits -oman -dresser -lyons -##sti -joking -eddy -judgement -omitted -digits -##game -juniors -##rae -cents -stricken -une -##ngo -wizards -weir -breton -nan -technician -fibers -liking -royalty -persia -terribly -magician -##rable -##unt -vance -cafeteria -booker -camille -warmer -##static -consume -cavern -gaps -compass -contemporaries -foyer -soothing -graveyard -maj -plunged -blush -##wear -cascade -demonstrates -ordinance -##nov -boyle -##lana -rockefeller -shaken -banjo -izzy -##ense -breathless -vines -##eman -alterations -chromosome -dwellings -feudal -mole -catalonia -relics -tenant -mandated -##fm -fridge -hats -honesty -patented -raul -heap -cruisers -accusing -enlightenment -infants -wherein -chatham -contractors -affinity -hc -osborne -piston -traps -maturity -##rana -lagos -##zal -peering -##nay -attendant -dealers -protocols -subset -prospects -biographical -##cre -artery -##zers -insignia -nuns -endured -##eration -recommend -schwartz -serbs -berger -cromwell -crossroads -enduring -clasped -grounded -##bine -marseille -twitched -abel -choke -catalyst -moldova -italians -##tist -disastrous -wee -##oured -##nti -wwf -nope -##piration -##asa -expresses -thumbs -##nza -coca -1781 -cheating -##ption -skipped -sensory -heidelberg -spies -satan -dangers -semifinal -bohemia -whitish -confusing -shipbuilding -relies -surgeons -landings -ravi -baku -moor -suffix 
-alejandro -##yana -litre -upheld -##unk -rajasthan -##rek -coaster -insists -posture -scenarios -etienne -favoured -appoint -transgender -elephants -poked -greenwood -defences -fulfilled -militant -somali -1758 -chalk -potent -##ucci -migrants -wink -assistants -nos -restriction -activism -niger -##ario -colon -shaun -##sat -daphne -##erated -swam -congregations -reprise -considerations -magnet -playable -xvi -overthrow -tobias -knob -chavez -coding -##mers -propped -katrina -orient -newcomer -##suke -temperate -##pool -farmhouse -interrogation -committing -##vert -forthcoming -strawberry -joaquin -macau -ponds -shocking -siberia -##cellular -chant -contributors -##nant -##ologists -sped -absorb -hail -1782 -spared -##hore -barbados -karate -opus -originates -saul -##xie -evergreen -leaped -##rock -correlation -exaggerated -weekday -unification -bump -tracing -brig -afb -pathways -utilizing -disturbance -kneeling -##stad -##guchi -100th -pune -##thy -decreasing -manipulation -miriam -academia -ecosystem -occupational -rbi -##lem -rift -rotary -stacked -incorporation -awakening -generators -guerrero -racist -##omy -cyber -derivatives -culminated -allie -annals -panzer -sainte -pops -zu -austro -##vate -algerian -politely -nicholson -mornings -educate -tastes -thrill -dartmouth -##gating -##jee -regan -differing -concentrating -choreography -divinity -pledged -alexandre -routing -gregor -madeline -##idal -apocalypse -##hora -gunfire -culminating -elves -fined -liang -lam -programmed -tar -guessing -transparency -gabrielle -##gna -cancellation -flexibility -##lining -accession -shea -stronghold -nets -specializes -##rgan -abused -hasan -sgt -exceeding -admiration -supermarket -photographers -specialised -tilt -resonance -hmm -perfume -sami -threatens -garland -botany -guarding -boiled -greet -puppy -russo -supplier -wilmington -vibrant -vijay -##bius -paralympic -grumbled -paige -faa -licking -margins -hurricanes -##gong -fest -grenade -ripping -##uz -counseling 
-weigh -##sian -needles -wiltshire -edison -costly -##not -fulton -tramway -redesigned -staffordshire -gasping -watkins -sleepy -candidacy -monkeys -timeline -throbbing -##bid -##sos -berth -uzbekistan -vanderbilt -bothering -overturned -ballots -gem -##iger -sunglasses -subscribers -hooker -compelling -ang -exceptionally -saloon -stab -##rdi -carla -terrifying -##vision -coil -##oids -satisfying -vendors -31st -mackay -deities -overlooked -ambient -bahamas -felipe -olympia -whirled -botanist -advertised -tugging -disciples -morales -unionist -rites -foley -morse -motives -creepy -##₀ -soo -##sz -bargain -highness -frightening -turnpike -tory -reorganization -depict -biographer -unopposed -manifesto -##gles -institut -emile -accidental -kapoor -##dam -kilkenny -cortex -lively -romanesque -jain -shan -cannons -##ske -petrol -echoing -amalgamated -disappears -cautious -proposes -sanctions -trenton -flotilla -aus -contempt -tor -canary -cote -theirs -##hun -conceptual -deleted -fascinating -paso -blazing -elf -honourable -hutchinson -##eiro -##outh -##zin -surveyor -amidst -wooded -reissue -intro -##ono -cobb -shelters -newsletter -hanson -brace -encoding -confiscated -dem -caravan -marino -scroll -melodic -cows -imam -##adi -##aneous -northward -searches -biodiversity -cora -roaring -##bers -connell -theologian -halo -compose -pathetic -unmarried -dynamo -az -calculation -toulouse -deserves -humour -nr -forgiveness -tam -undergone -martyr -pamela -myths -whore -counselor -hicks -heavens -battleship -electromagnetic -stellar -establishments -presley -hopped -##chin -temptation -90s -wills -##yuan -nhs -##nya -seminars -##yev -adaptations -gong -asher -lex -indicator -sikh -tobago -cites -goin -##yte -satirical -##gies -characterised -correspond -bubbles -lure -participates -##vid -eruption -skate -therapeutic -1785 -canals -wholesale -defaulted -sac -petit -##zzled -virgil -leak -ravens -portraying -##yx -ghetto -creators -dams -portray -vicente -##rington -fae 
-namesake -bounty -##arium -joachim -##ota -##iser -aforementioned -axle -snout -depended -dismantled -reuben -##ibly -gallagher -##lau -earnest -##ieu -##iary -inflicted -objections -##llar -asa -gritted -##athy -jericho -##sea -##was -flick -underside -ceramics -undead -substituted -eastward -undoubtedly -wheeled -chimney -##iche -guinness -siding -traitor -baptiste -disguised -inauguration -tipperary -choreographer -perched -warmed -stationary -##ntes -bacterial -##aurus -flores -phosphate -attacker -invaders -alvin -intersects -indirectly -immigrated -businessmen -cornelius -valves -narrated -pill -sober -nationale -monastic -applicants -scenery -##jack -motifs -constitutes -##osh -jurisdictions -tuning -irritation -woven -##uddin -fertility -gao -##erie -antagonist -impatient -glacial -hides -boarded -denominations -interception -##jas -nicola -algebraic -marquess -bahn -parole -buyers -bait -turbines -paperwork -bestowed -natasha -renee -oceans -purchases -vaccine -##tock -fixtures -playhouse -integrate -jai -oswald -intellectuals -booked -nests -mortimer -##isi -obsession -sept -##gler -##sum -scrutiny -simultaneous -squinted -##shin -collects -oven -shankar -penned -remarkably -slips -luggage -spectral -1786 -collaborations -louie -consolidation -##ailed -##ivating -hoover -blackpool -harness -ignition -vest -tails -belmont -mongol -skinner -##nae -visually -mage -derry -##tism -##unce -stevie -transitional -##rdy -redskins -drying -prep -prospective -annoyance -oversee -##loaded -fills -##books -announces -scowled -respects -prasad -mystic -tucson -##vale -revue -springer -bankrupt -1772 -aristotle -habsburg -##geny -dal -natal -nut -pod -chewing -darts -moroccan -walkover -rosario -lenin -punjabi -##ße -grossed -scattering -wired -invasive -hui -polynomial -corridors -wakes -gina -portrays -##cratic -arid -retreating -erich -irwin -sniper -##dha -linen -lindsey -maneuver -butch -shutting -socio -bounce -commemorative -postseason -jeremiah -pines -mystical 
-beads -abbas -furnace -bidding -consulted -assaulted -empirical -rubble -enclosure -sob -weakly -cancel -polly -yielded -##emann -curly -prediction -battered -70s -vhs -jacqueline -render -sails -barked -detailing -grayson -riga -sloane -raging -##yah -herbs -bravo -##athlon -alloy -giggle -imminent -suffers -assumptions -waltz -##itate -accomplishments -##ited -bathing -remixed -deception -##emia -deepest -##eis -balkan -frogs -##rong -slab -##pate -philosophers -peterborough -grains -imports -dickinson -rwanda -##atics -1774 -dirk -tablets -##rove -clone -##rice -caretaker -hostilities -mclean -##gre -regimental -treasures -norms -impose -tsar -tango -diplomacy -variously -complain -recognise -arrests -1779 -celestial -pulitzer -##dus -libretto -##moor -adele -splash -expectation -lds -confronts -##izer -spontaneous -harmful -wedge -entrepreneurs -buyer -bilingual -translate -rugged -conner -circulated -uae -eaton -##gra -##zzle -lingered -lockheed -vishnu -reelection -alonso -##oom -joints -yankee -headline -cooperate -heinz -laureate -invading -##sford -echoes -scandinavian -##dham -hugging -vitamin -salute -micah -hind -trader -##sper -radioactive -##ndra -militants -poisoned -ratified -remark -campeonato -deprived -wander -prop -##dong -##tani -##eye -chiang -darcy -##oping -mandolin -spice -statesman -babylon -walled -forgetting -afro -##cap -giorgio -buffer -##polis -planetary -##gis -overlap -terminals -kinda -centenary -##bir -arising -manipulate -elm -ke -1770 -##tad -chrysler -mapped -moose -pomeranian -quad -macarthur -assemblies -shoreline -recalls -stratford -##rted -noticeable -##evic -imp -##rita -##sque -accustomed -supplying -tents -disgusted -sipped -filters -khz -reno -selecting -luftwaffe -mcmahon -tyne -masterpiece -carriages -collided -dunes -exercised -flare -remembers -muzzle -heck -##rson -burgess -lunged -middleton -boycott -bilateral -##sity -hazardous -lumpur -multiplayer -spotlight -jackets -goldman -liege -porcelain -rag -waterford 
-attracts -hopeful -battling -ottomans -kensington -baked -hymns -cheyenne -lattice -levine -borrow -polymer -clashes -michaels -monitored -commitments -denounced -##von -cavity -##oney -hobby -akin -##holders -futures -intricate -cornish -patty -##oned -illegally -dolphin -##lag -barlow -yellowish -maddie -apologized -luton -plagued -##puram -##rds -sway -fanny -łodz -##rino -psi -suspicions -hanged -##eding -initiate -charlton -##por -nak -competent -analytical -annex -wardrobe -reservations -sect -fairfax -hedge -piled -buckingham -uneven -bauer -simplicity -snyder -interpret -accountability -donors -moderately -byrd -continents -##cite -disciple -jamaican -nominees -##uss -mongolian -diver -attackers -eagerly -ideological -pillows -miracles -apartheid -revolver -sulfur -clinics -moran -##enko -ile -katy -rhetoric -##icated -chronology -recycling -##hrer -elongated -mughal -pascal -profiles -vibration -databases -domination -##fare -matthias -digest -rehearsal -polling -weiss -initiation -reeves -clinging -flourished -impress -##hoff -buckley -symposium -rhythms -weed -emphasize -transforming -##taking -##yman -accountant -analyze -flicker -foil -priesthood -voluntarily -decreases -##hya -slater -sv -charting -mcgill -##lde -moreno -besieged -zur -robes -##phic -admitting -deported -turmoil -peyton -earthquakes -##ares -nationalists -beau -clair -brethren -interrupt -welch -curated -galerie -requesting -##ested -impending -steward -viper -##vina -complaining -beautifully -brandy -foam -nl -1660 -alessandro -punches -laced -explanations -##lim -attribute -clit -reggie -discomfort -##cards -smoothed -whales -##cene -adler -countered -duffy -disciplinary -widening -recipe -reliance -conducts -goats -gradient -preaching -##shaw -matilda -quasi -striped -meridian -cannabis -cordoba -certificates -##agh -##tering -graffiti -hangs -pilgrims -repeats -##ych -revive -urine -etat -##hawk -fueled -belts -fuzzy -susceptible -mauritius -salle -sincere -beers -hooks -##cki 
-arbitration -entrusted -advise -sniffed -seminar -junk -donnell -processors -principality -strapped -celia -mendoza -everton -fortunes -prejudice -starving -reassigned -steamer -##lund -tuck -evenly -foreman -##ffen -dans -envisioned -slit -baseman -liberia -rosemary -##weed -electrified -periodically -potassium -stride -contexts -sperm -slade -mariners -influx -bianca -subcommittee -##rane -spilling -icao -estuary -##nock -delivers -##ulata -isa -mira -bohemian -dessert -##sbury -welcoming -proudly -slowing -##chs -musee -ascension -russ -##vian -waits -##psy -africans -exploit -##morphic -eccentric -crab -peck -entrances -formidable -marketplace -groom -bolted -metabolism -patton -robbins -courier -payload -endure -##ifier -andes -refrigerator -ornate -##uca -ruthless -illegitimate -masonry -strasbourg -bikes -apples -quintet -willingly -niche -bakery -corpses -energetic -##cliffe -##sser -##ards -centimeters -centro -fuscous -cretaceous -rancho -##yde -andrei -telecom -tottenham -oasis -ordination -vulnerability -presiding -corey -penguins -sims -##pis -malawi -piss -correction -##cked -##ffle -##ryn -countdown -detectives -psychiatrist -psychedelic -dinosaurs -blouse -choi -vowed -randomly -##pol -49ers -scrub -blanche -bruins -dusseldorf -##using -unwanted -##ums -dominique -elevations -headlights -om -laguna -##oga -1750 -famously -ignorance -shrewsbury -breuning -che -confederacy -greco -overhaul -##screen -paz -skirts -disagreement -cruelty -jagged -phoebe -shifter -hovered -viruses -##wes -##lined -landlord -squirrel -dashed -ornamental -gag -wally -grange -literal -spurs -undisclosed -proceeding -billie -orphan -spanned -humidity -indy -weighted -presentations -explosions -lucian -##tary -vaughn -hindus -##anga -##hell -psycho -daytona -protects -efficiently -rematch -sly -tandem -##oya -rebranded -impaired -hee -metropolis -peach -godfrey -diaspora -ethnicity -prosperous -gleaming -dar -grossing -playback -##rden -stripe -pistols -##tain -births 
-labelled -##cating -rudy -alba -##onne -aquarium -hostility -##tase -shudder -sumatra -hardest -lakers -consonant -creeping -demos -homicide -capsule -zeke -liberties -expulsion -pueblo -##comb -trait -transporting -##ddin -##neck -##yna -depart -gregg -mold -ledge -hangar -oldham -playboy -termination -analysts -gmbh -romero -##itic -insist -cradle -filthy -brightness -slash -shootout -deposed -bordering -##truct -microwave -tumbled -sheltered -cathy -werewolves -messy -andersen -convex -clapped -clinched -satire -wasting -edo -rufus -##jak -mont -##etti -poznan -##keeping -restructuring -transverse -##rland -azerbaijani -slovene -gestures -roommate -choking -shear -##quist -vanguard -oblivious -##hiro -disagreed -baptism -##lich -coliseum -##aceae -salvage -societe -cory -locke -relocation -relying -versailles -ahl -swelling -##elo -cheerful -##edes -gin -sarajevo -obstacle -diverted -##nac -messed -thoroughbred -fluttered -utrecht -chewed -acquaintance -assassins -dispatch -mirza -##wart -salzburg -swell -yen -##gee -idle -ligue -samson -##nds -##igh -playful -spawned -##cise -tease -##case -burgundy -stirring -skeptical -interceptions -marathi -##dies -bedrooms -aroused -pinch -##lik -preferences -tattoos -buster -digitally -projecting -rust -##ital -kitten -priorities -addison -pseudo -##guard -dusk -icons -sermon -##psis -##iba -##lift -ju -truce -rink -##dah -##wy -defects -psychiatry -offences -calculate -glucose -##iful -##rized -##unda -francaise -##hari -richest -warwickshire -carly -1763 -purity -redemption -lending -##cious -muse -bruises -cerebral -aero -carving -preface -terminology -invade -monty -anarchist -blurred -##iled -rossi -treats -guts -shu -foothills -ballads -undertaking -premise -cecilia -affiliates -blasted -conditional -wilder -minors -drone -rudolph -buffy -swallowing -horton -attested -rutherford -howell -primetime -livery -penal -##bis -minimize -hydro -wrecked -wrought -palazzo -##gling -cans -vernacular -friedman -nobleman -shale 
-walnut -danielle -##ection -##tley -sears -##kumar -chords -lend -flipping -streamed -por -dracula -gallons -sacrifices -gamble -orphanage -##iman -mckenzie -##gible -boxers -daly -##balls -##ان -##ific -##rative -##iq -exploited -slated -##uity -circling -hillary -pinched -goldberg -provost -campaigning -piles -ironically -jong -mohan -successors -usaf -##tem -##ught -autobiographical -haute -preserves -##ending -acquitted -comparisons -hydroelectric -gangs -cypriot -torpedoes -rushes -derive -bumps -instability -fiat -pets -##mbe -silas -dye -reckless -settler -##itation -heats -##writing -canonical -maltese -fins -mushroom -stacy -aspen -avid -##kur -##loading -vickers -gaston -hillside -statutes -wilde -gail -kung -sabine -comfortably -motorcycles -##rgo -pneumonia -fetch -##sonic -axel -faintly -parallels -##oop -mclaren -spouse -compton -interdisciplinary -miner -##eni -clamped -##chal -##llah -separates -versa -##mler -scarborough -labrador -##lity -##osing -rutgers -hurdles -como -burt -divers -wichita -cade -coincided -bruised -mla -vineyard -##ili -##brush -notch -mentioning -jase -hearted -kits -doe -##acle -pomerania -##ady -ronan -seizure -pavel -problematic -##zaki -domenico -##ulin -catering -penelope -dependence -parental -emilio -ministerial -atkinson -##bolic -clarkson -chargers -colby -grill -peeked -arises -summon -##aged -fools -##grapher -faculties -qaeda -##vial -garner -refurbished -##hwa -geelong -disasters -nudged -bs -shareholder -lori -algae -reinstated -rot -##ades -##nous -invites -stainless -inclusive -##itude -diocesan -til -##icz -denomination -##xa -benton -floral -registers -##erman -##kell -absurd -brunei -guangzhou -hitter -retaliation -##uled -##eve -blanc -nh -consistency -contamination -##eres -dire -palermo -broadcasters -diaries -inspire -vols -brewer -tightening -mixtape -hormone -##tok -stokes -##color -##dly -##ssi -##ometer -##lington -sanitation -##tility -intercontinental -##adt -¹⁄₂ -cylinders -economies -favourable 
-unison -croix -gertrude -odyssey -vanity -dangling -##logists -upgrades -dice -middleweight -practitioner -henrik -parlor -orion -angered -lac -blurted -##rri -sensual -intends -swings -angled -##phs -husky -attain -peerage -precinct -textiles -cheltenham -shuffled -dai -confess -tasting -bhutan -##riation -tyrone -segregation -abrupt -ruiz -##rish -smirked -blackwell -confidential -browning -amounted -vase -scarce -fabulous -raided -staple -guyana -unemployed -glider -shay -##tow -carmine -troll -intervene -squash -superstar -cylindrical -len -roadway -researched -handy -##rium -##jana -lao -declares -##rring -##tadt -##elin -##kova -willem -shrubs -napoleonic -realms -skater -volkswagen -##ł -tad -hara -archaeologist -awkwardly -eerie -##kind -wiley -##heimer -titus -organizers -cfl -crusaders -lama -vent -enraged -thankful -occupants -maximilian -##gaard -possessing -textbooks -##oran -collaborator -quaker -##ulo -avalanche -mono -silky -straits -isaiah -mustang -surged -resolutions -potomac -descend -kilograms -plato -strains -saturdays -##olin -bernstein -##ype -holstein -ponytail -belize -conversely -heroine -perpetual -##ylus -charcoal -piedmont -glee -negotiating -backdrop -prologue -##jah -pasadena -climbs -ramos -sunni -##holm -##tner -##tri -anand -deficiency -hertfordshire -stout -##avi -aperture -orioles -##irs -doncaster -intrigued -bombed -coating -otis -##mat -cocktail -##jit -##eto -amir -arousal -sar -##proof -dixie -pots -whereabouts -##fted -drains -bullying -cottages -scripture -coherent -fore -poe -appetite -##uration -sampled -##ators -derrick -rotor -jays -peacock -installment -##rro -advisors -##coming -rodeo -scotch -##mot -##fen -##vant -ensued -rodrigo -dictatorship -martyrs -twenties -towed -incidence -marta -rainforest -sai -scaled -##cles -oceanic -qualifiers -symphonic -mcbride -dislike -generalized -aubrey -colonization -##iation -##lion -##ssing -disliked -lublin -salesman -##ulates -spherical -whatsoever -sweating -avalon 
-contention -punt -severity -alderman -atari -##dina -##grant -##rop -scarf -seville -vertices -annexation -fairfield -fascination -inspiring -launches -palatinate -regretted -##rca -feral -##iom -elk -nap -olsen -reddy -yong -##leader -##iae -garment -transports -feng -gracie -outrage -viceroy -insides -##esis -breakup -grady -organizer -softer -grimaced -murals -galicia -arranging -vectors -##rsten -##sb -##cens -sloan -##eka -bitten -ara -fender -nausea -bumped -kris -banquet -comrades -detector -persisted -##llan -adjustment -endowed -cinemas -sellers -##uman -peek -epa -kindly -neglect -simpsons -talon -mausoleum -runaway -hangul -lookout -##cic -coughed -acquainted -chloride -quicker -accordion -neolithic -##qa -artemis -coefficient -lenny -pandora -tx -##xed -ecstasy -litter -segunda -chairperson -gemma -hiss -rumor -vow -nasal -antioch -compensate -patiently -transformers -##eded -judo -morrow -penis -posthumous -bandits -husbands -denote -flaming -##any -##phones -langley -yorker -1760 -walters -##kle -gubernatorial -fatty -leroy -outlaw -##nine -unpublished -poole -jakob -##ᵢ -##ₙ -crete -distorted -superiority -##dhi -intercept -crust -mig -claus -crashes -stallion -frontal -armistice -##estinal -elton -aj -encompassing -camel -commemorated -malaria -woodward -calf -cigar -penetrate -##oso -willard -##rno -##uche -illustrate -amusing -convergence -noteworthy -##lma -##rva -journeys -realise -manfred -##sable -##vocation -hearings -fiance -##posed -educators -provoked -adjusting -##cturing -modular -stockton -paterson -vlad -rejects -electors -selena -maureen -##tres -##rce -swirled -##num -proportions -nanny -pawn -naturalist -parma -apostles -awoke -ethel -wen -##bey -monsoon -overview -##inating -mccain -rendition -risky -adorned -##ih -equestrian -germain -nj -conspicuous -confirming -##yoshi -shivering -##imeter -milestone -rumours -flinched -bounds -smacked -token -##bei -lectured -automobiles -##shore -impacted -##iable -nouns -nero -##leaf -ismail 
-prostitute -trams -bridget -sud -stimulus -impressions -reins -revolves -##gned -giro -honeymoon -##swell -criterion -##sms -##uil -libyan -prefers -##osition -preview -sucks -accusation -bursts -metaphor -diffusion -tolerate -faye -betting -cinematographer -liturgical -specials -bitterly -humboldt -##ckle -flux -rattled -##itzer -archaeologists -odor -authorised -marshes -discretion -##ов -alarmed -archaic -inverse -##leton -explorers -##pine -drummond -tsunami -woodlands -##minate -##tland -booklet -insanity -owning -insert -crafted -calculus -receivers -stung -##eca -##nched -prevailing -travellers -eyeing -lila -graphs -##borne -julien -##won -morale -adaptive -therapist -erica -cw -libertarian -bowman -pitches -vita -##ional -crook -##entation -caledonia -mutiny -##sible -1840s -automation -flock -##pia -ironic -pathology -##imus -remarried -joker -withstand -energies -##att -shropshire -hostages -madeleine -tentatively -conflicting -mateo -recipes -euros -mercenaries -nico -##ndon -albuquerque -augmented -mythical -bel -freud -##child -cough -##lica -freddy -lillian -genetically -nuremberg -calder -bonn -outdoors -paste -suns -urgency -vin -restraint -tyson -##cera -##selle -barrage -bethlehem -kahn -##par -mounts -nippon -barony -happier -ryu -makeshift -sheldon -blushed -castillo -barking -listener -taped -bethel -fluent -headlines -pornography -rum -disclosure -sighing -mace -doubling -gunther -manly -##plex -interventions -physiological -forwards -emerges -##tooth -##gny -compliment -rib -recession -visibly -barge -faults -connector -exquisite -prefect -##rlin -patio -##cured -elevators -italics -pena -wasp -satin -botswana -graceful -respectable -##jima -##rter -##oic -franciscan -generates -##dl -alfredo -disgusting -##olate -##iously -sherwood -warns -cod -promo -cheryl -sino -##escu -twitch -##zhi -brownish -thom -ortiz -##dron -densely -##beat -carmel -reinforce -##bana -anastasia -downhill -vertex -contaminated -remembrance -harmonic -homework 
-fiancee -gears -olds -angelica -ramsay -quiz -colliery -sevens -##cape -autism -##hil -walkway -##boats -ruben -abnormal -ounce -khmer -##bbe -zachary -bedside -morphology -punching -##olar -sparrow -convinces -hewitt -queer -remastered -rods -mabel -solemn -notified -lyricist -symmetric -##xide -encore -passports -wildcats -##uni -baja -##pac -mildly -##ease -bleed -commodity -mounds -glossy -orchestras -##omo -damian -prelude -ambitions -##vet -awhile -remotely -##aud -asserts -imply -##iques -distinctly -modelling -remedy -##dded -windshield -dani -xiao -##endra -audible -powerplant -invalid -elemental -acquisitions -##hala -immaculate -libby -plata -smuggling -ventilation -denoted -minh -##morphism -differed -dion -kelley -lore -mocking -sabbath -spikes -hygiene -drown -runoff -stylized -tally -liberated -aux -interpreter -righteous -aba -siren -reaper -pearce -millie -##cier -##yra -gaius -##iso -captures -##ttering -dorm -claudio -##sic -benches -knighted -blackness -##ored -discount -fumble -oxidation -routed -novak -perpendicular -spoiled -fracture -splits -pads -topology -##cats -axes -fortunate -offenders -protestants -esteem -broadband -convened -frankly -hound -prototypes -isil -facilitated -keel -##sher -sahara -awaited -bubba -orb -prosecutors -hem -##xing -relaxing -remnant -romney -sorted -slalom -stefano -ulrich -##active -exemption -folder -pauses -foliage -hitchcock -epithet -criticisms -##aca -ballistic -brody -hinduism -chaotic -youths -equals -##pala -pts -thicker -analogous -capitalist -improvised -overseeing -sinatra -ascended -beverage -straightforward -##kon -curran -bois -induce -surveying -emperors -sax -unpopular -cartoonist -fused -##mble -unto -##yuki -localities -##cko -##ln -darlington -slain -academie -lobbying -sediment -puzzles -##grass -defiance -dickens -manifest -tongues -alumnus -arbor -coincide -appalachian -mustafa -examiner -cabaret -traumatic -yves -bracelet -draining -heroin -magnum -baths -odessa -consonants 
-mitsubishi -##gua -kellan -vaudeville -joked -straps -probation -##ław -ceded -interfaces -##pas -##zawa -blinding -viet -rothschild -museo -huddersfield -tactic -##storm -brackets -dazed -incorrectly -##vu -reg -glazed -fearful -manifold -benefited -irony -stumbling -##rte -willingness -balkans -mei -wraps -##aba -injected -##lea -gu -syed -harmless -##hammer -bray -takeoff -poppy -timor -cardboard -astronaut -purdue -weeping -southbound -cursing -stalls -diagonal -##neer -lamar -bryce -comte -weekdays -harrington -##uba -negatively -##see -lays -grouping -##cken -##henko -affirmed -halle -modernist -##lai -hodges -smelling -aristocratic -baptized -dismiss -justification -oilers -coupling -qin -snack -healer -##qing -gardener -layla -battled -formulated -stephenson -gravitational -##gill -1768 -granny -coordinating -suites -##ioned -monarchs -##cote -##hips -blended -barrister -deposition -fia -mina -policemen -paranoid -##pressed -churchyard -covert -crumpled -creep -abandoning -tr -transmit -conceal -barr -understands -readiness -spire -##cology -##enia -startling -unlock -vida -bowled -slots -##nat -##islav -spaced -trusting -admire -rig -slack -casualty -classmates -##odes -##rar -##rked -amherst -furnished -evolve -foundry -menace -mead -##lein -flu -wesleyan -##kled -monterey -webber -##vos -wil -##mith -##на -bartholomew -justices -restrained -##cke -amenities -mediated -sewage -trenches -mainz -##thus -1800s -##cula -##inski -caine -bonding -converts -spheres -superseded -marianne -crypt -sweaty -ensign -historia -##br -spruce -##ask -forks -thoughtfully -yukon -pamphlet -ames -##uter -karma -##yya -bryn -negotiation -sighs -incapable -##mbre -##ntial -actresses -taft -##mill -luce -prevailed -##amine -1773 -motionless -envoy -testify -investing -sculpted -instructors -provence -kali -cullen -horseback -##while -goodwin -##jos -gaa -norte -##ldon -modify -wavelength -abd -skinned -sprinter -forecast -scheduling -marries -squared -tentative -##chman -boer 
-##isch -bolts -swap -fisherman -assyrian -impatiently -guthrie -martins -murdoch -tanya -nicely -dolly -lacy -med -syn -decks -fashionable -millionaire -surfing -heaved -tammy -consulate -attendees -routinely -fuse -saxophonist -backseat -malaya -##lord -scowl -tau -##ishly -sighted -steaming -##rks -##holes -##hong -ching -##wife -bless -conserved -jurassic -stacey -zion -chunk -rigorous -blaine -peabody -slayer -dismay -brewers -nz -##jer -det -##glia -glover -postwar -penetration -sylvester -imitation -vertically -airlift -heiress -knoxville -viva -##uin -macon -##rim -##fighter -##gonal -janice -##orescence -##wari -marius -belongings -leicestershire -blanco -inverted -preseason -sanity -sobbing -##due -##elt -##dled -collingwood -regeneration -flickering -shortest -##mount -##osi -feminism -##lat -sherlock -cabinets -fumbled -northbound -precedent -snaps -##mme -researching -##akes -guillaume -insights -manipulated -vapor -neighbour -gangster -frey -stalking -scarcely -callie -barnett -tendencies -doomed -assessing -slung -panchayat -ambiguous -bartlett -##etto -distributing -violating -wolverhampton -##hetic -swami -histoire -##urus -liable -pounder -groin -hussain -larsen -popping -surprises -##atter -vie -curt -##station -mute -relocate -musicals -authorization -richter -##sef -immortality -tna -bombings -deteriorated -yiddish -##acious -robbed -colchester -ao -verified -balancing -apostle -swayed -recognizable -oxfordshire -retention -nottinghamshire -contender -judd -invitational -shrimp -uhf -##icient -cleaner -longitudinal -tanker -##mur -acronym -broker -koppen -sundance -suppliers -##gil -clipped -fuels -petite -##anne -landslide -helene -diversion -populous -landowners -auspices -melville -quantitative -##xes -ferries -nicky -##llus -doo -haunting -roche -carver -downed -unavailable -##pathy -approximation -hiroshima -##hue -garfield -valle -comparatively -keyboardist -traveler -##eit -congestion -calculating -subsidiaries -##bate -serb 
-modernization -fairies -deepened -ville -averages -##lore -inflammatory -tonga -##itch -co₂ -squads -##hea -gigantic -serum -enjoyment -retailer -verona -35th -cis -##phobic -magna -technicians -##vati -arithmetic -##sport -levin -##dation -amtrak -chow -sienna -##eyer -backstage -entrepreneurship -##otic -learnt -tao -##udy -worcestershire -formulation -baggage -hesitant -bali -sabotage -##kari -barren -enhancing -murmur -pl -freshly -putnam -syntax -aces -medicines -resentment -bandwidth -##sier -grins -chili -guido -##sei -framing -implying -gareth -lissa -genevieve -pertaining -admissions -geo -thorpe -proliferation -sato -bela -analyzing -parting -##gor -awakened -##isman -huddled -secrecy -##kling -hush -gentry -dungeons -##ego -coasts -##utz -sacrificed -##chule -landowner -mutually -prevalence -programmer -adolescent -disrupted -seaside -gee -trusts -vamp -georgie -##nesian -##iol -schedules -sindh -##market -etched -hm -sparse -bey -beaux -scratching -gliding -unidentified -collaborating -gems -jesuits -oro -accumulation -shaping -mbe -anal -##xin -enthusiasts -newscast -##egan -janata -dewey -parkinson -ankara -biennial -towering -inconsistent -##chet -thriving -terminate -cabins -furiously -eats -advocating -donkey -marley -muster -phyllis -leiden -##user -grassland -glittering -iucn -loneliness -memorandum -armenians -##ddle -popularized -rhodesia -60s -lame -##illon -sans -bikini -header -orbits -##finger -##ulator -sharif -spines -biotechnology -strolled -naughty -yates -##wire -fremantle -milo -##mour -abducted -removes -##atin -humming -##chrome -##ester -hume -pivotal -##rates -armand -grams -believers -elector -rte -apron -bis -scraped -##yria -endorsement -initials -##llation -dotted -hints -buzzing -emigration -nearer -indicators -##ulu -coarse -neutron -protectorate -##uze -directional -exploits -pains -loire -1830s -proponents -guggenheim -rabbits -ritchie -hectare -inputs -hutton -##raz -verify -##ako -boilers -longitude -##lev -skeletal 
-yer -emilia -citrus -compromised -##gau -prescription -paragraph -eduard -cadillac -attire -categorized -kenyan -weddings -charley -##bourg -entertain -monmouth -##lles -nutrients -davey -mesh -incentive -practised -ecosystems -kemp -subdued -overheard -##rya -bodily -maxim -##nius -apprenticeship -ursula -##fight -lodged -rug -silesian -unconstitutional -patel -inspected -coyote -unbeaten -##hak -34th -disruption -convict -parcel -##nham -collier -implicated -mallory -##iac -susannah -winkler -##rber -shia -phelps -sediments -graphical -robotic -##sner -adulthood -mart -smoked -##isto -kathryn -clarified -##aran -divides -convictions -oppression -pausing -burying -##mt -federico -mathias -eileen -##tana -kite -hunched -##acies -##atz -disadvantage -liza -kinetic -greedy -paradox -yokohama -dowager -trunks -ventured -##gement -gupta -vilnius -olaf -##thest -crimean -hopper -##ej -progressively -arturo -mouthed -arrondissement -##fusion -rubin -simulcast -oceania -##orum -##stra -##rred -busiest -intensely -navigator -cary -##vine -##hini -##bies -fife -rowe -rowland -posing -insurgents -shafts -lawsuits -activate -conor -inward -culturally -garlic -##eering -eclectic -##hui -##kee -##nl -furrowed -vargas -meteorological -rendezvous -##aus -culinary -commencement -##dition -quota -##notes -mommy -salaries -overlapping -mule -##iology -##mology -sums -wentworth -##isk -##zione -mainline -subgroup -##illy -hack -plaintiff -verdi -bulb -differentiation -engagements -multinational -supplemented -bertrand -caller -regis -##naire -##sler -##arts -##imated -blossom -propagation -kilometer -viaduct -vineyards -##uate -beckett -optimization -golfer -songwriters -seminal -semitic -thud -volatile -evolving -ridley -##wley -trivial -distributions -scandinavia -jiang -wrestled -insistence -emphasizes -napkin -##ods -adjunct -rhyme -##ricted -##eti -hopeless -surrounds -tremble -32nd -smoky -##ntly -oils -medicinal -padded -steer -wilkes -concessions -hue -uniquely -blinded 
-landon -##lane -hendrix -commemorating -dex -specify -chicks -##ggio -intercity -morley -##torm -highlighting -##oting -pang -oblique -stalled -##liner -flirting -newborn -1769 -bishopric -shaved -currie -dharma -spartan -##ooped -favorites -smug -novella -sirens -abusive -creations -espana -##lage -paradigm -semiconductor -sheen -##rdo -##yen -##zak -nrl -renew -##pose -##tur -adjutant -marches -norma -##enity -ineffective -weimar -grunt -##gat -lordship -plotting -expenditure -infringement -lbs -refrain -mimi -mistakenly -postmaster -1771 -##bara -ras -motorsports -tito -subjective -##zza -bully -stew -##kaya -prescott -##raphic -##zam -bids -styling -paranormal -reeve -sneaking -exploding -katz -akbar -migrant -syllables -indefinitely -##ogical -destroys -replaces -applause -##phine -pest -##fide -articulated -bertie -##cars -##ptic -courtroom -crowley -aesthetics -cummings -tehsil -hormones -titanic -dangerously -##ibe -stadion -jaenelle -auguste -ciudad -##chu -mysore -partisans -lucan -philipp -##aly -debating -henley -interiors -##rano -##tious -homecoming -beyonce -usher -henrietta -prepares -weeds -ely -plucked -##pire -##dable -luxurious -##aq -artifact -password -pasture -juno -maddy -minsk -##dder -##ologies -##rone -assessments -martian -royalist -1765 -examines -##mani -nino -parry -scooped -relativity -##eli -##uting -##cao -congregational -noisy -traverse -##agawa -strikeouts -nickelodeon -obituary -transylvania -binds -depictions -polk -trolley -##yed -##lard -breeders -##under -dryly -hokkaido -1762 -strengths -stacks -bonaparte -neared -prostitutes -stamped -anaheim -gutierrez -sinai -##zzling -bram -fresno -madhya -proton -##lena -##llum -##phon -reelected -wanda -##anus -##lb -ample -distinguishing -##yler -grasping -sermons -tomato -bland -stimulation -avenues -##eux -spreads -scarlett -fern -pentagon -assert -baird -chesapeake -calmed -distortion -fatalities -##olis -correctional -pricing -##astic -##gina -prom -dammit -ying -collaborate 
-##chia -welterweight -33rd -pointer -substitution -bonded -umpire -communicating -multitude -paddle -##obe -federally -intimacy -##insky -betray -ssr -##lett -##lves -##therapy -airbus -##tery -functioned -ud -bearer -biomedical -##hire -##nca -condom -brink -ik -##nical -macy -flap -gma -experimented -jelly -lavender -##icles -##ulia -munro -##mian -##tial -rye -##rle -60th -gigs -hottest -rotated -predictions -fuji -bu -##erence -##omi -barangay -##fulness -##sas -clocks -##rwood -##liness -cereal -roe -wight -decker -uttered -babu -onion -forcibly -##df -petra -sarcasm -hartley -peeled -storytelling -##xley -##ysis -##ffa -fibre -kiel -auditor -fig -harald -greenville -##berries -geographically -nell -quartz -##athic -cemeteries -crossings -nah -holloway -reptiles -chun -sichuan -snowy -corrections -##ivo -zheng -ambassadors -blacksmith -fielded -fluids -hardcover -turnover -medications -melvin -academies -##erton -roach -absorbing -spaniards -colton -##founded -outsider -espionage -kelsey -edible -##ulf -dora -establishes -##sham -##tries -contracting -##tania -cinematic -costello -nesting -##uron -connolly -duff -##nology -mma -##mata -fergus -sexes -optics -spectator -woodstock -banning -##hee -##fle -differentiate -outfielder -refinery -gerhard -horde -lair -drastically -##udi -landfall -##cheng -motorsport -odi -##achi -predominant -quay -skins -##ental -edna -harshly -complementary -murdering -##aves -wreckage -ono -outstretched -lennox -munitions -galen -reconcile -scalp -bicycles -gillespie -questionable -rosenberg -guillermo -jarvis -kabul -opium -yd -##twined -abuses -decca -outpost -##cino -sensible -neutrality -ponce -anchorage -atkins -turrets -inadvertently -disagree -libre -vodka -reassuring -weighs -##yal -glide -jumper -ceilings -repertory -outs -stain -##bial -envy -##ucible -smashing -heightened -policing -hyun -mixes -lai -prima -##ples -celeste -##bina -lucrative -intervened -kc -manually -##rned -stature -staffed -bun -bastards -nairobi 
-priced -##auer -thatcher -##kia -tripped -comune -##ogan -##pled -brasil -incentives -emanuel -hereford -musica -##kim -benedictine -biennale -##lani -eureka -gardiner -rb -knocks -sha -##ael -##elled -##onate -efficacy -ventura -masonic -sanford -maize -leverage -##feit -capacities -santana -##aur -novelty -vanilla -##cter -##tour -benin -##oir -neptune -drafting -tallinn -##cable -humiliation -##boarding -schleswig -fabian -bernardo -liturgy -spectacle -sweeney -pont -routledge -cosmos -ut -hilt -sleek -universally -##eville -##gawa -typed -##dry -favors -allegheny -glaciers -##rly -recalling -aziz -parasite -requiem -auf -##berto -##llin -illumination -##breaker -##issa -festivities -bows -govern -vibe -vp -sprawled -larson -pilgrim -bwf -leaping -##rts -##ssel -alexei -greyhound -hoarse -##dler -##oration -seneca -##cule -gaping -##ulously -##pura -cinnamon -##gens -##rricular -craven -fantasies -houghton -engined -reigned -dictator -supervising -##oris -bogota -commentaries -unnatural -fingernails -spirituality -tighten -canadiens -protesting -intentional -cheers -sparta -##ytic -##iere -##zine -widen -belgarath -controllers -dodd -iaaf -navarre -##ication -defect -squire -steiner -whisky -##mins -inevitably -tome -##gold -chew -##lid -elastic -##aby -streaked -alliances -jailed -regal -##ined -##phy -czechoslovak -narration -absently -##uld -bluegrass -guangdong -quran -criticizing -hose -hari -##liest -##owa -skier -streaks -deploy -##lom -raft -bose -dialed -huff -##eira -haifa -simplest -bursting -endings -sultanate -##titled -franks -whitman -ensures -sven -##ggs -collaborators -forster -organising -banished -napier -injustice -teller -layered -thump -##otti -roc -battleships -evidenced -fugitive -sadie -robotics -##roud -equatorial -geologist -##iza -yielding -##bron -##sr -internationale -mecca -##diment -skyline -toad -uploaded -reflective -undrafted -lal -leafs -bayern -##dai -lakshmi -shortlisted -##stick -##wicz -camouflage -donate -christi -lau 
-##acio -disclosed -nemesis -1761 -assemble -straining -northamptonshire -tal -##asi -bernardino -premature -heidi -42nd -coefficients -galactic -reproduce -buzzed -sensations -zionist -monsieur -myrtle -archery -strangled -musically -viewpoint -antiquities -bei -trailers -seahawks -cured -pee -preferring -tasmanian -lange -sul -##working -colder -overland -lucivar -massey -gatherings -haitian -##smith -disapproval -flaws -##cco -##enbach -1766 -npr -##icular -boroughs -creole -forums -techno -1755 -dent -abdominal -streetcar -##eson -##stream -procurement -gemini -predictable -##tya -acheron -christoph -feeder -fronts -vendor -bernhard -jammu -tumors -slang -##uber -goaltender -twists -curving -manson -vuelta -mer -peanut -confessions -pouch -unpredictable -allowance -theodor -vascular -##factory -bala -authenticity -metabolic -coughing -nanjing -##cea -pembroke -##bard -splendid -36th -hourly -##ahu -elmer -handel -##ivate -awarding -thrusting -experimentation -##hesion -caressed -entertained -steak -##rangle -biologist -orphans -baroness -oyster -stepfather -##dridge -mirage -reefs -speeding -barons -1764 -inhabit -preached -repealed -##tral -honoring -boogie -captives -administer -johanna -##imate -gel -suspiciously -1767 -sobs -##dington -backbone -hayward -garry -##folding -##nesia -maxi -##oof -##ppe -ellison -galileo -##stand -crimea -frenzy -amour -bumper -matrices -natalia -baking -garth -palestinians -##grove -smack -conveyed -ensembles -gardening -##manship -##rup -##stituting -1640 -harvesting -topography -shifters -dormitory -##carriage -##lston -ist -skulls -##stadt -dolores -jewellery -sarawak -##wai -##zier -fences -christy -confinement -tumbling -credibility -fir -stench -##bria -##plication -##nged -##sam -virtues -##belt -marjorie -pba -##eem -##made -celebrates -schooner -agitated -barley -fulfilling -anthropologist -restrict -novi -regulating -##nent -padres -##rani -##hesive -loyola -tabitha -milky -olson -proprietor -crambidae -guarantees 
-intercollegiate -ljubljana -hilda -##sko -ignorant -hooded -sardinia -##lidae -##vation -frontman -privileged -witchcraft -jammed -laude -poking -##than -bracket -amazement -yunnan -##erus -maharaja -linnaeus -commissioning -milano -peacefully -##logies -akira -rani -regulator -grasses -##rance -luzon -crows -compiler -gretchen -seaman -edouard -buccaneers -ellington -hamlets -whig -socialists -##anto -directorial -easton -mythological -##kr -##vary -rhineland -semantic -taut -dune -inventions -succeeds -##iter -replication -branched -##pired -prosecuted -kangaroo -penetrated -##avian -middlesbrough -doses -bleak -madam -predatory -relentless -##vili -reluctance -##vir -hailey -crore -silvery -1759 -monstrous -swimmers -transmissions -hawthorn -informing -##eral -toilets -caracas -crouch -##sett -cartel -hadley -##aling -alexia -yvonne -##biology -cinderella -eton -superb -blizzard -stabbing -industrialist -maximus -##orus -groves -maud -clade -oversized -comedic -##bella -rosen -nomadic -fulham -montane -beverages -galaxies -redundant -swarm -##rot -##folia -##llis -buckinghamshire -fen -bearings -bahadur -##rom -gilles -phased -dynamite -faber -benoit -##ount -fractured -tailored -anya -spices -westwood -cairns -auditions -inflammation -steamed -##rocity -##acion -##urne -skyla -thereof -watford -torment -archdeacon -transforms -demeanor -fucked -serge -##sor -mckenna -minas -entertainer -##icide -caress -originate -residue -##sty -1740 -##ilised -##org -beech -##wana -subsidies -##ghton -emptied -gladstone -firefighters -voodoo -het -nightingale -tamara -edmond -ingredient -weaknesses -silhouette -compatibility -withdrawing -hampson -##mona -anguish -giggling -bookstore -southernmost -tilting -##vance -bai -economical -briefcase -dreadful -hinted -projections -shattering -totaling -##rogate -analogue -indicted -periodical -fullback -##dman -haynes -##tenberg -##ffs -##ishment -1745 -thirst -stumble -penang -vigorous -##ddling -##kor -##lium -octave -##ove 
-##enstein -##inen -##ones -siberian -##uti -cbn -repeal -swaying -##vington -khalid -tanaka -unicorn -otago -plastered -lobe -riddle -##rella -perch -##ishing -croydon -filtered -graeme -tripoli -##ossa -crocodile -##chers -sufi -mined -##tung -inferno -lsu -##phi -swelled -utilizes -£2 -cale -periodicals -styx -hike -informally -coop -lund -##tidae -ala -hen -qui -transformations -disposed -sheath -chickens -##cade -fitzroy -silesia -unacceptable -odisha -1650 -sabrina -spokane -ratios -athena -massage -shen -dilemma -##drum -##riz -##hul -corona -doubtful -niall -##pha -##bino -fines -cite -acknowledging -bangor -ballard -bathurst -##resh -huron -mustered -alzheimer -garments -kinase -tyre -warship -flashback -pulmonary -braun -cheat -kamal -cyclists -constructions -grenades -ndp -traveller -excuses -stomped -signalling -trimmed -futsal -mosques -relevance -##wine -wta -##vah -hoc -##riding -optimistic -##´s -deco -interacting -rejecting -moniker -waterways -##ieri -##oku -mayors -gdansk -outnumbered -pearls -##ended -##hampton -fairs -totals -dominating -notions -stairway -compiling -pursed -commodities -grease -yeast -##jong -carthage -griffiths -residual -amc -contraction -laird -sapphire -##marine -##ivated -amalgamation -dissolve -inclination -lyle -packaged -altitudes -suez -canons -graded -lurched -narrowing -boasts -guise -enrico -##ovsky -rower -scarred -bree -cub -iberian -protagonists -bargaining -proposing -trainers -voyages -fishes -##aea -##ivist -##verance -encryption -artworks -kazan -sabre -cleopatra -hepburn -rotting -supremacy -mecklenburg -##brate -burrows -hazards -outgoing -flair -organizes -##ctions -scorpion -##usions -boo -chevalier -dunedin -slapping -ineligible -pensions -##omic -manufactures -emails -bismarck -weakening -blackish -ding -mcgee -quo -##rling -northernmost -manpower -greed -sampson -clicking -##ange -##horpe -##inations -##roving -torre -##eptive -##moral -symbolism -38th -asshole -meritorious -outfits -splashed 
-biographies -sprung -astros -##tale -filly -raoul -nw -tokugawa -linden -clubhouse -##apa -tracts -romano -##pio -putin -chained -dickson -gunshot -moe -gunn -rashid -##tails -zipper -##bas -##nea -contrasted -##ply -##udes -plum -pharaoh -##pile -aw -comedies -ingrid -sandwiches -subdivisions -mariana -kamen -hz -delaney -veto -herring -##words -possessive -outlines -##roup -siemens -stairwell -gallantry -messiah -palais -yells -zeppelin -bolivar -##cede -smackdown -mckinley -##mora -##yt -muted -geologic -finely -unitary -avatar -hamas -maynard -rees -bog -contrasting -##rut -liv -chico -disposition -##erate -becca -dmitry -yeshiva -narratives -##lva -##ulton -mercenary -sharpe -tempered -navigate -stealth -amassed -keynes -##lini -untouched -##rrie -havoc -lithium -##fighting -abyss -graf -southward -wolverine -balloons -implements -ngos -transitions -##icum -ambushed -concacaf -dormant -economists -##dim -costing -csi -rana -universite -boulders -verity -##llon -collin -mellon -misses -cypress -fluorescent -lifeless -spence -##ulla -crewe -shepard -pak -revelations -jolly -gibbons -paw -##dro -##quel -freeing -shack -fries -palatine -##hiko -accompaniment -cruising -recycled -##aver -erwin -sorting -synthesizers -dyke -realities -strides -enslaved -wetland -##ghan -competence -gunpowder -grassy -maroon -reactors -objection -##oms -carlson -gearbox -macintosh -radios -shelton -##sho -clergyman -prakash -mongols -trophies -oricon -stimuli -twenty20 -cantonese -cortes -mirrored -##saurus -bhp -cristina -melancholy -##lating -enjoyable -nuevo -##wny -downfall -schumacher -##ind -banging -lausanne -rumbled -paramilitary -reflex -ax -amplitude -migratory -##gall -##ups -midi -barnard -lastly -sherry -##nall -keystone -##kra -carleton -slippery -coloring -foe -socket -otter -##rgos -mats -##tose -consultants -bafta -bison -topping -primal -abandonment -transplant -atoll -hideous -mort -pained -reproduced -tae -howling -##turn -unlawful -billionaire -hotter -poised 
-lansing -##chang -dinamo -retro -messing -domesday -##mina -blitz -timed -##athing -##kley -ascending -gesturing -##izations -signaled -tis -chinatown -mermaid -savanna -jameson -##aint -catalina -##pet -##hers -cochrane -cy -chatting -##kus -alerted -computation -mused -noelle -majestic -mohawk -campo -octagonal -##sant -##hend -aspiring -##mart -comprehend -iona -paralyzed -shimmering -swindon -rhone -##eley -reputed -configurations -pitchfork -agitation -francais -gillian -lipstick -##ilo -outsiders -pontifical -resisting -bitterness -sewer -rockies -##edd -##ucher -misleading -1756 -exiting -galloway -##nging -risked -##heart -commemoration -schultz -##rka -integrating -##rsa -poses -shrieked -##weiler -guineas -gladys -jerking -owls -goldsmith -nightly -penetrating -##unced -lia -ignited -betsy -##aring -##thorpe -follower -vigorously -##rave -coded -kiran -knit -zoology -tbilisi -##bered -repository -govt -deciduous -dino -growling -##bba -enhancement -unleashed -chanting -pussy -biochemistry -##eric -kettle -repression -toxicity -nrhp -##arth -##kko -##bush -ernesto -commended -outspoken -mca -parchment -kristen -##aton -bisexual -raked -glamour -navajo -conditioned -showcased -##hma -spacious -youthful -##esa -usl -appliances -junta -brest -layne -conglomerate -enchanted -chao -loosened -picasso -circulating -inspect -montevideo -##centric -##kti -piazza -spurred -##aith -bari -freedoms -poultry -stamford -lieu -indigo -sarcastic -bahia -stump -attach -dvds -frankenstein -lille -approx -scriptures -pollen -##script -nmi -overseen -##ivism -tides -proponent -newmarket -inherit -milling -##erland -centralized -##rou -distributors -credentials -drawers -abbreviation -##lco -downing -uncomfortably -ripe -##oes -erase -franchises -populace -##bery -##khar -decomposition -pleas -##tet -daryl -sabah -##wide -fearless -genie -lesions -annette -##ogist -oboe -appendix -nair -dripped -petitioned -maclean -mosquito -parrot -hampered -1648 -operatic -reservoirs 
-##tham -irrelevant -jolt -summarized -##fp -medallion -##taff -clawed -harlow -narrower -goddard -marcia -bodied -fremont -suarez -altering -tempest -mussolini -porn -##isms -sweetly -oversees -walkers -solitude -grimly -shrines -ich -supervisors -hostess -dietrich -legitimacy -brushes -expressive -##yp -dissipated -##rse -localized -systemic -##nikov -gettysburg -##uaries -dialogues -muttering -housekeeper -sicilian -discouraged -##frey -beamed -kaladin -halftime -kidnap -##amo -##llet -1754 -synonymous -depleted -instituto -insulin -reprised -##opsis -clashed -##ctric -interrupting -radcliffe -insisting -medici -1715 -ejected -playfully -turbulent -starvation -##rini -shipment -rebellious -petersen -verification -merits -##rified -cakes -##charged -1757 -milford -shortages -spying -fidelity -##aker -emitted -storylines -harvested -seismic -##iform -cheung -kilda -theoretically -barbie -lynx -##rgy -##tius -goblin -mata -poisonous -##nburg -reactive -residues -obedience -##евич -conjecture -##rac -hating -sixties -kicker -moaning -motown -##bha -emancipation -neoclassical -##hering -consoles -ebert -professorship -##tures -sustaining -assaults -obeyed -affluent -incurred -tornadoes -##eber -##zow -emphasizing -highlanders -cheated -helmets -##ctus -internship -terence -bony -executions -legislators -berries -peninsular -tinged -##aco -1689 -amplifier -corvette -ribbons -lavish -pennant -##lander -worthless -##chfield -##forms -mariano -pyrenees -expenditures -##icides -chesterfield -mandir -tailor -39th -sergey -nestled -willed -aristocracy -devotees -goodnight -raaf -rumored -weaponry -remy -appropriations -harcourt -burr -riaa -##lence -limitation -unnoticed -guo -soaking -swamps -##tica -collapsing -tatiana -descriptive -brigham -psalm -##chment -maddox -##lization -patti -caliph -##aja -akron -injuring -serra -##ganj -basins -##sari -astonished -launcher -##church -hilary -wilkins -sewing -##sf -stinging -##fia -##ncia -underwood -startup -compilations 
-vibrations -embankment -jurist -bard -juventus -groundwater -kern -palaces -helium -boca -cramped -marissa -soto -##worm -jae -princely -##ggy -faso -bazaar -warmly -##voking -pairing -##lite -##grate -##nets -wien -freaked -ulysses -rebirth -##alia -mummy -guzman -jimenez -stilled -##nitz -trajectory -tha -woken -archival -professions -##pts -##pta -hilly -shadowy -shrink -##bolt -norwood -glued -migrate -stereotypes -devoid -##pheus -evacuate -horrors -infancy -gotham -knowles -optic -downloaded -sachs -kingsley -parramatta -darryl -mor -##onale -shady -commence -confesses -kan -##meter -##placed -marlborough -roundabout -regents -frigates -##imating -gothenburg -revoked -carvings -clockwise -convertible -intruder -##sche -banged -##ogo -vicky -bourgeois -##mony -dupont -footing -##gum -##real -buckle -yun -penthouse -sane -serviced -stakeholders -neumann -##eers -comb -##gam -catchment -pinning -rallies -typing -##elles -forefront -freiburg -sweetie -giacomo -widowed -goodwill -worshipped -aspirations -midday -##vat -fishery -##trick -bournemouth -turk -hearth -ethanol -guadalajara -murmurs -sl -##uge -afforded -scripted -##hta -wah -##jn -coroner -translucent -memorials -puck -progresses -clumsy -##race -candace -recounted -##slin -##uve -filtering -##mac -howl -strata -heron -leveled -##ays -dubious -##oja -##wheel -citations -exhibiting -##laya -##mics -turkic -##lberg -injunction -##ennial -antibodies -organise -##rigues -cardiovascular -cushion -inverness -##zquez -dia -cocoa -sibling -##tman -##roid -expanse -feasible -tunisian -algiers -##relli -rus -dso -westphalia -bro -tacoma -downloads -##ours -konrad -duran -##hdi -continuum -jett -compares -legislator -secession -##nable -##gues -##zuka -translating -reacher -##gley -##ła -aleppo -##agi -orchards -trapping -linguist -versatile -drumming -postage -calhoun -superiors -##mx -barefoot -leary -##cis -ignacio -alfa -kaplan -##rogen -bratislava -mori -##vot -disturb -haas -cartridges -gilmore -radiated 
-salford -tunic -hades -##ulsive -archeological -delilah -magistrates -auditioned -brewster -charters -empowerment -blogs -cappella -dynasties -iroquois -whipping -##krishna -raceway -truths -myra -weaken -judah -mcgregor -##horse -mic -refueling -37th -burnley -bosses -markus -premio -query -##gga -dunbar -##economic -darkest -lyndon -sealing -commendation -reappeared -##mun -addicted -ezio -slaughtered -satisfactory -shuffle -##eves -##thic -##uj -fortification -warrington -##otto -resurrected -fargo -mane -##utable -##lei -foreword -ox -##aris -##vern -abrams -hua -##mento -sakura -##alo -sentimental -##skaya -midfield -##eses -sturdy -scrolls -macleod -##kyu -entropy -##lance -mitochondrial -cicero -excelled -thinner -convoys -perceive -##oslav -##urable -systematically -grind -burkina -##tagram -ops -##aman -guantanamo -##cloth -##tite -forcefully -wavy -##jou -pointless -##linger -##tze -layton -portico -superficial -clerical -outlaws -##hism -burials -muir -##inn -creditors -hauling -rattle -##leg -calais -monde -archers -reclaimed -dwell -wexford -hellenic -falsely -remorse -##tek -dough -furnishings -##uttered -gabon -neurological -novice -##igraphy -contemplated -pulpit -nightstand -saratoga -##istan -documenting -pulsing -taluk -##firmed -busted -marital -##rien -disagreements -wasps -##yes -hodge -mcdonnell -mimic -fran -pendant -dhabi -musa -##nington -congratulations -argent -darrell -concussion -losers -regrets -thessaloniki -reversal -donaldson -hardwood -thence -achilles -ritter -##eran -demonic -jurgen -prophets -goethe -eki -classmate -##cking -yank -irrational -##inging -perished -seductive -qur -sourced -##crat -##typic -mustard -ravine -barre -horizontally -characterization -phylogenetic -boise -##dit -##runner -##tower -brutally -intercourse -seduce -##bbing -fay -ferris -ogden -amar -nik -unarmed -##inator -evaluating -kyrgyzstan -sweetness -##lford -##oki -mccormick -meiji -notoriety -stimulate -disrupt -figuring -instructional -mcgrath 
-##zoo -groundbreaking -##lto -flinch -khorasan -agrarian -bengals -mixer -radiating -##sov -ingram -pitchers -nad -tariff -##cript -tata -##codes -##emi -##ungen -appellate -lehigh -##bled -##giri -brawl -duct -texans -##ciation -##ropolis -skipper -speculative -vomit -doctrines -stresses -davy -graders -whitehead -jozef -timely -cumulative -haryana -paints -appropriately -boon -cactus -##ales -##pid -dow -legions -##pit -perceptions -1730 -picturesque -##yse -periphery -rune -wr -##aha -celtics -sentencing -whoa -##erin -confirms -variance -moines -mathews -spade -rave -fronted -blending -alleging -reared -##paper -grassroots -eroded -##physical -directs -ordeal -##sław -accelerate -hacker -rooftop -##inia -lev -buys -cebu -devote -##lce -specialising -##ulsion -choreographed -repetition -warehouses -##ryl -paisley -tuscany -analogy -sorcerer -hash -huts -shards -descends -exclude -nix -chaplin -ito -vane -##drich -causeway -misconduct -limo -orchestrated -glands -jana -##kot -u2 -##sons -branching -contrasts -scoop -longed -##virus -chattanooga -syrup -cornerstone -##tized -##mind -##iaceae -careless -precedence -frescoes -##uet -chilled -consult -modelled -snatch -peat -##thermal -caucasian -humane -relaxation -spins -temperance -##lbert -occupations -lambda -hybrids -moons -##oese -rolf -societal -yerevan -ness -##ssler -befriended -mechanized -nominate -trough -boasted -cues -seater -##hom -bends -##tangle -conductors -emptiness -eurasian -adriatic -tian -##cie -anxiously -lark -propellers -chichester -jock -##holding -credible -recounts -tori -loyalist -abduction -##hoot -##redo -nepali -##mite -ventral -tempting -##ango -##crats -steered -##wice -javelin -dipping -laborers -prentice -looming -titanium -badges -emir -tensor -##ntation -egyptians -rash -denies -hawthorne -lombard -showers -wehrmacht -dietary -trojan -##reus -welles -executing -horseshoe -lifeboat -##lak -elsa -infirmary -nearing -roberta -boyer -mutter -trillion -joanne -##fine -##oked -sinks 
-vortex -uruguayan -clasp -sirius -##block -accelerator -prohibit -sunken -byu -chronological -diplomats -ochreous -symmetrical -1644 -maia -##tology -salts -reigns -atrocities -##ия -hess -bared -issn -##vyn -cater -saturated -##cycle -##isse -sable -voyager -dyer -yusuf -##inge -fountains -wolff -##nni -engraving -rollins -atheist -ominous -##ault -herr -chariot -martina -strung -##fell -##farlane -horrific -sahib -gazes -saetan -erased -ptolemy -##olic -flushing -lauderdale -analytic -##ices -navarro -beak -gorilla -herrera -broom -guadalupe -raiding -sykes -bsc -deliveries -1720 -invasions -carmichael -tajikistan -thematic -ecumenical -sentiments -onstage -##rians -##brand -##sume -catastrophic -flanks -molten -##arns -waller -aimee -terminating -##icing -alternately -##oche -nehru -printers -outraged -##eving -empires -template -banners -repetitive -za -##oise -vegetarian -##tell -guiana -opt -cavendish -lucknow -synthesized -##hani -##mada -finalized -##ctable -fictitious -mayoral -unreliable -##enham -embracing -peppers -rbis -##chio -##neo -inhibition -slashed -togo -orderly -embroidered -salty -barron -benito -totaled -##dak -pubs -simulated -caden -devin -tolkien -momma -welding -sesame -##ept -gottingen -hardness -shaman -temeraire -adequately -pediatric -assertion -radicals -composure -cadence -seafood -beaufort -lazarus -mani -warily -cunning -kurdistan -cantata -##kir -ares -##clusive -nape -townland -geared -insulted -flutter -boating -violate -draper -dumping -malmo -##hh -##romatic -firearm -alta -bono -obscured -##clave -exceeds -panorama -unbelievable -##train -preschool -##essed -disconnected -installing -rescuing -secretaries -accessibility -##castle -##ifice -##film -bouts -slug -waterway -mindanao -##buro -##ratic -halves -calming -liter -maternity -adorable -bragg -electrification -mcc -##dote -roxy -schizophrenia -munoz -kaye -whaling -mil -tingling -tolerant -##ago -unconventional -volcanoes -##finder -deportivo -##llie -robson -kaufman 
-neuroscience -wai -deportation -masovian -scraping -converse -##bh -hacking -bulge -##oun -administratively -yao -mammoth -booster -claremont -hooper -nomenclature -pursuits -mclaughlin -melinda -##sul -catfish -barclay -substrates -taxa -zee -kimberly -packets -padma -##ality -borrowing -ostensibly -solvent -##bri -##genesis -##mist -lukas -shreveport -veracruz -##lou -##wives -cheney -anatolia -hobbs -##zyn -cyclic -radiant -alistair -greenish -siena -dat -independents -##bation -conform -pieter -hyper -applicant -bradshaw -spores -telangana -vinci -inexpensive -nuclei -jang -nme -spd -cradled -receptionist -pow -##rika -fascism -##ifer -experimenting -##ading -##iec -##region -jocelyn -maris -stair -nocturnal -toro -constabulary -elgin -##kker -msc -##giving -##schen -##rase -doherty -doping -sarcastically -batter -maneuvers -##cano -##apple -##gai -##git -intrinsic -##nst -##stor -1753 -showtime -cafes -gasps -lviv -ushered -##thed -fours -restart -astonishment -transmitting -flyer -shrugs -##sau -intriguing -cones -dictated -mushrooms -medial -##kovsky -##elman -escorting -gaped -godfather -##door -##sell -djs -recaptured -timetable -vila -1710 -aerodrome -mortals -scientology -##orne -angelina -mag -convection -unpaid -insertion -intermittent -lego -##nated -endeavor -kota -pereira -##lz -bwv -glamorgan -insults -agatha -fey -##cend -fleetwood -mahogany -protruding -steamship -zeta -##arty -mcguire -suspense -##sphere -advising -urges -##wala -hurriedly -meteor -gilded -inline -arroyo -stalker -##oge -excitedly -revered -##cure -earle -introductory -##break -##ilde -mutants -puff -pulses -reinforcement -##haling -curses -lizards -stalk -correlated -##fixed -fallout -macquarie -##unas -bearded -denton -heaving -##ocation -winery -assign -dortmund -##lkirk -everest -invariant -charismatic -susie -##elling -bled -lesley -telegram -sumner -bk -##ogen -wilcox -needy -colbert -duval -##iferous -##mbled -allotted -attends -imperative -##hita -replacements -hawker 
-##inda -insurgency -##zee -##eke -casts -##yla -ives -transitioned -##pack -##powering -authoritative -baylor -flex -cringed -plaintiffs -woodrow -##skie -drastic -ape -aroma -unfolded -commotion -preoccupied -theta -routines -lasers -privatization -wand -domino -ek -clenching -nsa -strategically -showered -bile -handkerchief -pere -storing -christophe -insulting -nakamura -romani -asiatic -magdalena -palma -cruises -stripping -konstantin -soaring -##berman -colloquially -forerunner -havilland -incarcerated -parasites -sincerity -##utus -disks -plank -saigon -##ining -corbin -homo -ornaments -powerhouse -##tlement -chong -fastened -feasibility -idf -morphological -usable -##nish -##zuki -aqueduct -jaguars -keepers -##flies -aleksandr -faust -assigns -ewing -bacterium -hurled -tricky -hungarians -integers -wallis -yamaha -##isha -hushed -oblivion -aviator -evangelist -friars -##eller -monograph -ode -##nary -airplanes -labourers -charms -##nee -1661 -hagen -tnt -rudder -fiesta -transcript -dorothea -ska -inhibitor -maccabi -retorted -raining -encompassed -clauses -menacing -1642 -lineman -##gist -vamps -##dick -gloom -##rera -dealings -easing -seekers -##nut -##pment -helens -unmanned -##anu -##isson -basics -##amy -##ckman -adjustments -1688 -brutality -horne -##zell -##mable -aggregator -##thal -rhino -##drick -##vira -counters -##rting -mn -montenegrin -packard -##unciation -##♭ -##kki -reclaim -scholastic -thugs -pulsed -##icia -syriac -quan -saddam -banda -kobe -blaming -buddies -dissent -##lusion -##usia -corbett -jaya -delle -erratic -lexie -##hesis -amiga -hermes -##pressing -##leen -chapels -gospels -jamal -##uating -compute -revolving -warp -##sso -##thes -armory -##eras -##gol -antrim -loki -##kow -##asian -##good -##zano -braid -handwriting -subdistrict -funky -pantheon -##iculate -concurrency -estimation -improper -juliana -##his -newcomers -johnstone -staten -communicated -##oco -##alle -sausage -stormy -##stered -##tters -superfamily -##grade -acidic 
-collateral -tabloid -##oped -##rza -bladder -austen -##ellant -mcgraw -##hay -hannibal -mein -aquino -lucifer -wo -badger -boar -cher -christensen -greenberg -interruption -##kken -jem -mocked -bottoms -cambridgeshire -##lide -sprawling -##bbly -eastwood -ghent -synth -##buck -advisers -##bah -nominally -hapoel -qu -daggers -estranged -fabricated -towels -vinnie -wcw -misunderstanding -anglia -nothin -unmistakable -##dust -##lova -chilly -marquette -truss -##edge -##erine -reece -##lty -##chemist -##connected -41st -bash -raion -waterfalls -##ump -##main -labyrinth -queue -theorist -##istle -bharatiya -flexed -soundtracks -rooney -leftist -patrolling -wharton -plainly -alleviate -eastman -schuster -topographic -engages -immensely -unbearable -fairchild -1620 -dona -lurking -parisian -oliveira -ia -indictment -hahn -bangladeshi -##aster -##uming -##ential -antonia -expects -indoors -kildare -harlan -##logue -##ogenic -##sities -forgiven -##wat -childish -tavi -##mide -##orra -plausible -grimm -successively -scooted -##bola -##rith -spartans -emery -flatly -epilogue -##wark -flourish -##iny -##tracted -##overs -##oshi -bestseller -distressed -receipt -spitting -hermit -topological -##cot -drilled -subunit -francs -##layer -eel -##fk -##itas -octopus -footprint -petitions -##say -##foil -interfering -leaking -palo -##metry -thistle -valiant -##pic -narayan -mcpherson -##fast -gonzales -##enne -dustin -novgorod -solos -##zman -doin -##patient -##meyer -soluble -ashland -cuffs -carole -pendleton -whistling -vassal -##river -deviation -revisited -constituents -rallied -rotate -loomed -##eil -##nting -amateurs -augsburg -auschwitz -crowns -skeletons -##cona -bonnet -dummy -globalization -simeon -sleeper -mandal -differentiated -##crow -##mare -milne -bundled -exasperated -talmud -owes -segregated -##feng -##uary -dentist -piracy -props -##rang -devlin -##torium -malicious -paws -##laid -dependency -##ergy -##fers -##enna -pistons -rourke -jed -grammatical -tres -maha 
-wig -ghostly -jayne -##achal -##creen -##ilis -##lins -designate -##with -arrogance -cambodian -clones -showdown -throttle -twain -##ception -lobes -metz -nagoya -braking -##furt -roaming -##minster -amin -crippled -##llary -indifferent -hoffmann -idols -intimidating -1751 -influenza -memo -onions -1748 -bandage -consciously -##landa -##rage -clandestine -observes -swiped -tangle -##ener -##jected -##trum -##bill -##lta -hugs -congresses -josiah -spirited -##dek -humanist -managerial -filmmaking -inmate -rhymes -debuting -grimsby -ur -##laze -duplicate -vigor -republished -bolshevik -refurbishment -antibiotics -martini -methane -newscasts -royale -horizons -levant -iain -visas -##ischen -paler -##around -manifestation -snuck -alf -chop -futile -pedestal -rehab -##kat -bmg -kerman -res -fairbanks -jarrett -abstraction -saharan -##zek -1746 -procedural -clearer -kincaid -sash -luciano -##ffey -crunch -helmut -##vara -revolutionaries -##tute -creamy -leach -##mmon -1747 -permitting -nes -plight -wendell -##lese -contra -clancy -ipa -mach -staples -autopsy -disturbances -nueva -karin -pontiac -##uding -proxy -venerable -haunt -leto -bergman -expands -##helm -wal -##pipe -canning -celine -cords -obesity -##enary -intrusion -planner -##phate -reasoned -sequencing -harrow -##chon -##dora -marred -mcintyre -repay -tarzan -darting -harrisburg -margarita -repulsed -##lding -belinda -hamburger -novo -compliant -runways -bingham -registrar -skyscraper -cuthbert -improvisation -livelihood -##corp -##elial -admiring -##dened -sporadic -believer -casablanca -popcorn -asha -shovel -##bek -##dice -coiled -tangible -##dez -casper -elsie -resin -tenderness -rectory -##ivision -avail -sonar -##mori -boutique -##dier -guerre -bathed -upbringing -vaulted -sandals -blessings -##naut -##utnant -1680 -foxes -pia -corrosion -hesitantly -confederates -crystalline -footprints -shapiro -tirana -valentin -drones -45th -microscope -shipments -texted -inquisition -wry -guernsey -unauthorized 
-resigning -ripple -schubert -stu -reassure -felony -##ardo -brittle -koreans -##havan -##ives -dun -implicit -tyres -##aldi -##lth -magnolia -##ehan -##puri -##poulos -aggressively -fei -gr -familiarity -##poo -indicative -##trust -fundamentally -jimmie -overrun -anchors -moans -##opus -britannia -armagh -purposely -seizing -##vao -bewildered -mundane -avoidance -cosmopolitan -geometridae -quartermaster -caf -chatter -engulfed -gleam -purge -##icate -juliette -jurisprudence -guerra -revisions -##bn -casimir -brew -##jm -1749 -clapton -cloudy -conde -hermitage -simulations -torches -vincenzo -matteo -##rill -hidalgo -booming -westbound -accomplishment -tentacles -unaffected -##sius -annabelle -flopped -sloping -##litz -dreamer -interceptor -vu -##loh -consecration -copying -messaging -breaker -climates -hospitalized -1752 -torino -afternoons -winfield -witnessing -##teacher -breakers -choirs -sawmill -coldly -##ege -sipping -haste -uninhabited -conical -bibliography -pamphlets -severn -edict -##oca -deux -illnesses -grips -rehearsals -sis -thinkers -tame -##keepers -1690 -acacia -reformer -##osed -##rys -shuffling -##iring -##shima -eastbound -ionic -rhea -flees -littered -##oum -rocker -vomiting -groaning -champ -overwhelmingly -civilizations -paces -sloop -adoptive -##tish -skaters -##vres -aiding -nikola -shriek -##ignon -pharmaceuticals -tuna -calvert -gustavo -stocked -yearbook -##urai -##mana -computed -subsp -riff -hanoi -kelvin -hamid -moors -pastures -summons -jihad -nectar -##ctors -bayou -untitled -pleasing -vastly -republics -intellect -##ulio -##tou -crumbling -stylistic -##ی -consolation -frequented -h₂o -walden -widows -##iens -##ignment -chunks -improves -grit -recited -##dev -snarl -sociological -##arte -##gul -inquired -##held -bruise -clube -consultancy -homogeneous -hornets -multiplication -pasta -prick -savior -##grin -##kou -##phile -yoon -##gara -grimes -vanishing -cheering -reacting -bn -distillery -##quisite -##vity -coe -dockyard -massif 
-##jord -escorts -voss -##valent -byte -chopped -hawke -illusions -workings -floats -##koto -##vac -kv -annapolis -madden -##onus -alvaro -noctuidae -##cum -##scopic -avenge -steamboat -forte -illustrates -erika -##trip -dew -nationalities -bran -manifested -thirsty -diversified -muscled -reborn -##standing -arson -##lessness -##dran -##logram -##boys -##kushima -##vious -willoughby -##phobia -alsace -dashboard -yuki -##chai -granville -myspace -publicized -tricked -##gang -adjective -##ater -relic -reorganisation -enthusiastically -indications -saxe -##lassified -consolidate -iec -padua -helplessly -ramps -renaming -regulars -pedestrians -accents -convicts -inaccurate -lowers -mana -##pati -barrie -bjp -outta -someplace -berwick -flanking -invoked -marrow -sparsely -excerpts -clothed -rei -##ginal -wept -##straße -##vish -##ptive -membranes -aquitaine -creeks -cutler -sheppard -implementations -##dur -fragrance -budge -concordia -magnesium -marcelo -##antes -gladly -vibrating -##rral -##ggles -montrose -##omba -lew -seamus -1630 -cocky -##ament -##uen -bjorn -##rrick -fielder -fluttering -##lase -methyl -kimberley -mcdowell -reductions -barbed -##jic -##tonic -aeronautical -condensed -distracting -##promising -huffed -##cala -##sle -claudius -invincible -missy -pious -balthazar -##lang -butte -combo -orson -##dication -myriad -1707 -silenced -##fed -##rh -netball -yourselves -##oza -clarify -heller -peg -durban -etudes -offender -roast -blackmail -curvature -##woods -vile -illicit -suriname -##linson -overture -1685 -bubbling -gymnast -tucking -##mming -##ouin -maldives -##bala -gurney -##dda -##eased -##oides -backside -pinto -jars -racehorse -tending -##rdial -baronetcy -wiener -duly -##rke -barbarian -cupping -flawed -##thesis -bertha -pleistocene -puddle -swearing -##nob -##tically -fleeting -prostate -amulet -educating -##mined -##tler -75th -jens -respondents -cavaliers -papacy -raju -##iente -##ulum -##tip -funnel -disneyland -##lley -sociologist -##iam 
-faulkner -louvre -menon -##dson -##ower -afterlife -mannheim -peptide -referees -comedians -meaningless -##anger -##laise -fabrics -hurley -renal -sleeps -##bour -##icle -breakout -kristin -roadside -animator -clover -disdain -unsafe -redesign -##urity -firth -barnsley -portage -reset -narrows -commandos -expansive -speechless -tubular -essendon -eyelashes -smashwords -##yad -##bang -##claim -craved -sprinted -chet -somme -astor -wrocław -orton -bane -##erving -##uing -mischief -##amps -##sund -scaling -terre -##xious -impairment -offenses -undermine -moi -soy -contiguous -arcadia -inuit -seam -##tops -macbeth -rebelled -##icative -##iot -elaborated -frs -uniformed -##dberg -powerless -priscilla -stimulated -qc -arboretum -frustrating -trieste -bullock -##nified -enriched -glistening -intern -##adia -locus -nouvelle -ollie -ike -lash -starboard -tapestry -headlined -hove -rigged -##vite -pollock -##yme -thrive -clustered -cas -roi -gleamed -olympiad -##lino -pressured -regimes -##hosis -##lick -ripley -##ophone -kickoff -gallon -rockwell -##arable -crusader -glue -revolutions -scrambling -1714 -grover -##jure -englishman -aztec -contemplating -coven -preach -triumphant -tufts -##esian -rotational -##phus -falkland -##brates -strewn -clarissa -rejoin -environmentally -glint -banded -drenched -moat -albanians -johor -rr -maestro -malley -nouveau -shaded -taxonomy -adhere -bunk -airfields -##ritan -1741 -encompass -remington -tran -##erative -amelie -mazda -friar -morals -passions -##zai -breadth -vis -##hae -argus -burnham -caressing -insider -rudd -##imov -##rso -italianate -murderous -textual -wainwright -armada -bam -weave -timer -##taken -##nh -fra -##crest -ardent -salazar -taps -tunis -##ntino -allegro -gland -philanthropic -##chester -implication -##optera -esq -judas -noticeably -wynn -##dara -inched -indexed -crises -villiers -bandit -royalties -patterned -cupboard -interspersed -accessory -isla -kendrick -entourage -stitches -##esthesia -headwaters -##ior 
-interlude -distraught -draught -1727 -##basket -biased -sy -transient -triad -subgenus -adapting -kidd -shortstop -##umatic -dimly -spiked -mcleod -reprint -nellie -pretoria -windmill -##cek -singled -##mps -reunite -##orous -bankers -outlying -##omp -##ports -##tream -apologies -cosmetics -patsy -##deh -##ocks -##yson -bender -nantes -serene -##nad -lucha -mmm -##cius -##gli -cmll -coinage -nestor -juarez -##rook -smeared -sprayed -twitching -sterile -irina -embodied -juveniles -enveloped -miscellaneous -cancers -dq -gulped -luisa -crested -swat -donegal -ref -##anov -##acker -hearst -mercantile -##lika -doorbell -vicki -##alla -##som -bilbao -psychologists -stryker -sw -horsemen -turkmenistan -wits -##national -anson -mathew -screenings -##umb -rihanna -##agne -##nessy -aisles -##iani -##osphere -hines -kenton -saskatoon -tasha -truncated -##champ -##itan -mildred -advises -fredrik -interpreting -inhibitors -##athi -spectroscopy -##hab -##kong -karim -panda -##oia -##nail -conqueror -kgb -leukemia -##dity -arrivals -cheered -pisa -phosphorus -shielded -##riated -mammal -unitarian -urgently -chopin -sanitary -##mission -spicy -drugged -hinges -##tort -tipping -trier -impoverished -westchester -##caster -epoch -nonstop -##gman -##khov -aromatic -centrally -cerro -##tively -##vio -billions -modulation -sedimentary -facilitating -outrageous -goldstein -##eak -##kt -ld -maitland -penultimate -pollard -##dance -fleets -spaceship -vertebrae -##nig -alcoholism -als -recital -##bham -##omics -##bm -trois -##tropical -commemorates -##meric -marge -##raction -1643 -cosmetic -ravaged -##ige -catastrophe -eng -##shida -albrecht -arterial -bellamy -decor -harmon -##rde -bulbs -synchronized -vito -easiest -shetland -shielding -wnba -##glers -##ssar -##riam -brianna -cumbria -##aceous -##rard -cores -thayer -##nsk -brood -hilltop -luminous -carts -keynote -larkin -logos -##cta -##mund -##quay -lilith -tinted -wrestle -mobilization -##uses -sequential -siam -bloomfield 
-takahashi -##ieving -presenters -ringo -blazed -witty -##oven -##ignant -devastation -haydn -harmed -newt -therese -##peed -gershwin -molina -rabbis -sudanese -innate -restarted -##sack -##fus -slices -wb -##shah -enroll -hypothetical -hysterical -1743 -fabio -indefinite -warped -exchanging -unsuitable -##sboro -gallo -1603 -bret -cobalt -homemade -##hunter -operatives -##dhar -terraces -durable -latch -pens -whorls -##ctuated -##eaux -billing -ligament -succumbed -##gly -regulators -spawn -##brick -##stead -filmfare -rochelle -##nzo -1725 -circumstance -saber -supplements -##nsky -##tson -crowe -wellesley -carrot -##9th -##movable -primate -drury -sincerely -topical -##mad -##rao -callahan -kyiv -smarter -tits -undo -##yeh -announcements -anthologies -barrio -nebula -##islaus -##shaft -##tyn -bodyguards -assassinate -barns -emmett -scully -##yd -##eland -##tino -##itarian -demoted -gorman -lashed -prized -adventist -writ -##gui -alla -invertebrates -##ausen -1641 -amman -1742 -align -healy -redistribution -##gf -##rize -insulation -##drop -adherents -hezbollah -vitro -ferns -yanking -registering -uppsala -cheerleading -confines -mischievous -tully -##ross -49th -docked -roam -stipulated -pumpkin -##bry -prompt -##ezer -blindly -shuddering -craftsmen -frail -scented -katharine -scramble -shaggy -sponge -helix -zaragoza -43rd -backlash -fontaine -seizures -posse -cowan -nonfiction -telenovela -wwii -hammered -undone -##gpur -encircled -irs -##ivation -artefacts -oneself -searing -smallpox -##belle -##osaurus -shandong -breached -upland -blushing -rankin -infinitely -psyche -tolerated -docking -evicted -##col -unmarked -##lving -gnome -lettering -litres -musique -##oint -benevolent -##jal -blackened -##anna -mccall -racers -tingle -##ocene -##orestation -introductions -radically -##hiff -##باد -1610 -1739 -munchen -plead -##nka -condo -scissors -##sight -##tens -apprehension -##cey -##yin -hallmark -watering -formulas -sequels -##llas -aggravated -bae -commencing 
-##building -enfield -prohibits -marne -vedic -civilized -euclidean -jagger -beforehand -blasts -dumont -##arney -##nem -conversions -hierarchical -rios -simulator -##dya -##lellan -hedges -oleg -thrusts -shadowed -darby -maximize -1744 -gregorian -##nded -##routed -sham -unspecified -##hog -emory -factual -##smo -fooled -##rger -ortega -wellness -marlon -##oton -##urance -casket -keating -ley -enclave -##ayan -char -influencing -jia -##chenko -ammonia -erebidae -incompatible -violins -cornered -##arat -grooves -astronauts -columbian -rampant -fabrication -kyushu -mahmud -vanish -##dern -mesopotamia -##lete -##rgen -caspian -kenji -pitted -##vered -grimace -roanoke -tchaikovsky -twinned -##analysis -##awan -xinjiang -arias -clemson -kazakh -sizable -1662 -##khand -##vard -plunge -tatum -vittorio -##nden -cholera -##dana -bracing -indifference -projectile -superliga -##chee -realises -upgrading -porte -retribution -##vies -nk -stil -##resses -ama -bureaucracy -blackberry -bosch -testosterone -collapses -greer -##pathic -ioc -fifties -malls -##erved -bao -baskets -adolescents -siegfried -##osity -##tosis -mantra -detecting -existent -fledgling -##cchi -dissatisfied -gan -telecommunication -mingled -sobbed -controversies -outdated -taxis -##raus -fright -slams -##lham -##fect -##tten -detectors -fetal -tanned -##uw -fray -goth -olympian -skipping -mandates -scratches -sheng -unspoken -hyundai -tracey -hotspur -restrictive -##buch -americana -mundo -##bari -burroughs -diva -vulcan -##6th -distinctions -thumping -##ngen -mikey -sheds -fide -rescues -springsteen -vested -valuation -##ece -##ely -pinnacle -rake -sylvie -##edo -almond -quivering -##irus -alteration -faltered -##wad -51st -hydra -ticked -##kato -recommends -##dicated -antigua -arjun -stagecoach -wilfred -trickle -pronouns -##pon -aryan -nighttime -##anian -gall -pea -stitch -##hei -leung -milos -##dini -eritrea -starved -snowfall -kant -parasitic -cot -discus -hana -strikers -appleton -kitchens -##erina 
-##partisan -##itha -##vius -disclose -metis -##channel -1701 -##vera -fitch -1735 -blooded -##tila -decimal -##tang -##bai -cyclones -eun -bottled -peas -pensacola -basha -bolivian -crabs -boil -lanterns -partridge -roofed -1645 -necks -##phila -opined -patting -##kla -##lland -chuckles -volta -whereupon -##nche -devout -euroleague -suicidal -##dee -inherently -involuntary -knitting -nasser -##hide -puppets -colourful -courageous -southend -stills -miraculous -hodgson -richer -rochdale -ethernet -greta -uniting -prism -umm -##haya -##itical -##utation -deterioration -pointe -prowess -##ropriation -lids -scranton -billings -subcontinent -##koff -##scope -brute -kellogg -psalms -degraded -##vez -stanisław -##ructured -ferreira -pun -astonishing -gunnar -##yat -arya -prc -gottfried -##tight -excursion -##ographer -dina -##quil -##nare -huffington -illustrious -wilbur -verandah -##zard -naacp -##odle -constructive -fjord -kade -##naud -generosity -thrilling -baseline -cayman -frankish -plastics -accommodations -zoological -##fting -cedric -qb -motorized -##dome -##otted -squealed -tackled -canucks -budgets -situ -asthma -dail -gabled -grasslands -whimpered -writhing -judgments -minnie -##carbon -bananas -grille -domes -monique -odin -maguire -markham -tierney -##estra -##chua -libel -poke -speedy -atrium -laval -notwithstanding -##edly -fai -kala -##sur -robb -##sma -listings -luz -supplementary -tianjin -##acing -enzo -jd -ric -scanner -croats -transcribed -arden -##hair -##raphy -##lver -seventies -staggering -alam -horticultural -hs -regression -timbers -blasting -##ounded -montagu -manipulating -##cit -catalytic -1550 -troopers -##meo -condemnation -fitzpatrick -##oire -##roved -inexperienced -1670 -castes -##lative -outing -dubois -flicking -quarrel -ste -learners -1625 -whistled -##class -classify -tariffs -temperament -folly -liszt -##yles -immersed -jordanian -ceasefire -apparel -extras -maru -fished -##bio -harta -stockport -assortment -craftsman -paralysis 
-transmitters -##cola -blindness -##wk -fatally -proficiency -solemnly -##orno -repairing -amore -groceries -ultraviolet -##chase -schoolhouse -##tua -resurgence -nailed -##otype -ruse -saliva -diagrams -##tructing -albans -rann -thirties -antennas -hilarious -cougars -paddington -stats -##eger -breakaway -reza -authorship -prohibiting -scoffed -##etz -##ttle -conscription -defected -trondheim -##fires -ivanov -keenan -##adan -##ciful -##fb -##slow -locating -##ials -##tford -cadiz -basalt -blankly -interned -rags -rattling -##tick -carpathian -reassured -bum -guildford -iss -staunch -##onga -astronomers -sera -sofie -emergencies -susquehanna -##heard -duc -mastery -vh1 -williamsburg -bayer -buckled -craving -##khan -##rdes -bloomington -##write -alton -barbecue -##bians -justine -##hri -##ndt -delightful -smartphone -newtown -photon -retrieval -peugeot -hissing -##monium -##orough -flavors -lighted -relaunched -tainted -##games -##lysis -anarchy -microscopic -hopping -adept -evade -evie -##beau -inhibit -sinn -adjustable -hurst -intuition -wilton -44th -lawful -lowlands -stockings -thierry -##dalen -##hila -##nai -fates -prank -maison -lobbied -provocative -1724 -utopia -##qual -carbonate -gujarati -purcell -##rford -curtiss -##mei -overgrown -arenas -mediation -swallows -##rnik -respectful -turnbull -##hedron -##hope -alyssa -ozone -##ʻi -ami -gestapo -johansson -snooker -canteen -cuff -declines -empathy -stigma -##ags -##raine -taxpayers -volga -##wright -##copic -lifespan -overcame -tattooed -enactment -giggles -##ador -##camp -barrington -bribe -obligatory -orbiting -peng -##enas -elusive -sucker -##vating -cong -hardship -empowered -anticipating -estrada -cryptic -greasy -detainees -planck -sudbury -plaid -dod -kayla -##ears -##vb -##zd -mortally -##hein -cognition -radha -liechtenstein -meade -richly -argyle -harpsichord -liberalism -trumpets -lauded -tyrant -salsa -tiled -lear -promoters -reused -slicing -trident -##chuk -##gami -##lka -cantor -checkpoint 
-##points -gaul -leger -mammalian -##tov -##aar -##schaft -doha -frenchman -nirvana -##vino -delgado -headlining -##eron -##iography -jug -tko -1649 -naga -intersections -benfica -nawab -##suka -ashford -gulp -##deck -##vill -##rug -brentford -frazier -pleasures -dunne -potsdam -shenzhen -dentistry -##tec -flanagan -##dorff -##hear -chorale -dinah -prem -quezon -##rogated -relinquished -sutra -terri -##pani -flaps -##rissa -poly -##rnet -homme -aback -##eki -linger -womb -##kson -##lewood -doorstep -orthodoxy -threaded -westfield -##rval -dioceses -fridays -subsided -##gata -loyalists -##biotic -##ettes -letterman -lunatic -prelate -tenderly -invariably -souza -thug -winslow -##otide -furlongs -gogh -jeopardy -##runa -pegasus -##umble -humiliated -standalone -tagged -##roller -freshmen -klan -##bright -attaining -initiating -transatlantic -logged -viz -##uance -1723 -combatants -intervening -stephane -chieftain -despised -grazed -cdc -galveston -godzilla -macro -simulate -##planes -parades -##esses -##ductive -##unes -equator -overdose -##cans -##hosh -##lifting -joshi -epstein -sonora -treacherous -aquatics -manchu -responsive -##sation -supervisory -##christ -##llins -##ibar -##balance -##uso -kimball -karlsruhe -mab -##emy -ignores -phonetic -spaghetti -almighty -danzig -rumbling -tombstone -designations -lured -outset -##felt -supermarkets -grupo -kei -kraft -susanna -##blood -comprehension -genealogy -##aghan -##verted -redding -##ythe -1722 -bowing -##pore -##roi -lest -sharpened -fulbright -valkyrie -sikhs -##unds -swans -bouquet -merritt -##tage -##venting -commuted -redhead -clerks -leasing -cesare -dea -hazy -##vances -fledged -greenfield -servicemen -##gical -armando -blackout -sagged -downloadable -intra -potion -pods -##4th -##mism -attendants -gambia -stale -##ntine -plump -asteroids -rediscovered -buds -flea -hive -##neas -1737 -classifications -debuts -##eles -olympus -scala -##eurs -##gno -##mute -hummed -sigismund -visuals -wiggled -await 
-pilasters -clench -sulfate -##ances -bellevue -enigma -trainee -snort -##sw -clouded -denim -##rank -churning -hartman -lodges -riches -sima -##missible -accountable -socrates -regulates -mueller -1702 -avoids -solids -himalayas -nutrient -pup -##jevic -squat -fades -nec -##lates -##pina -##rona -##ου -privateer -tequila -##gative -##mpton -hornet -immortals -##dou -asturias -cleansing -dario -##rries -##anta -etymology -servicing -zhejiang -##venor -##nx -horned -erasmus -rayon -relocating -£10 -##bags -escalated -promenade -stubble -2010s -artisans -axial -liquids -mora -sho -yoo -##tsky -bundles -oldies -##nally -notification -bastion -##ths -sparkle -##lved -1728 -leash -pathogen -highs -##hmi -immature -gonzaga -ignatius -mansions -monterrey -sweets -bryson -##loe -polled -regatta -brightest -pei -rosy -squid -hatfield -payroll -addict -meath -cornerback -heaviest -lodging -##mage -capcom -rippled -##sily -barnet -mayhem -ymca -snuggled -rousseau -##cute -blanchard -fragmented -leighton -chromosomes -risking -##strel -##utter -corinne -coyotes -cynical -hiroshi -yeomanry -##ractive -ebook -grading -mandela -plume -agustin -magdalene -##rkin -bea -femme -trafford -##coll -##lun -##tance -52nd -fourier -upton -##mental -camilla -gust -iihf -islamabad -longevity -##kala -feldman -netting -##rization -endeavour -foraging -mfa -orr -##open -greyish -contradiction -graz -##ruff -handicapped -marlene -tweed -oaxaca -spp -campos -miocene -pri -configured -cooks -pluto -cozy -pornographic -##entes -70th -fairness -glided -jonny -lynne -rounding -sired -##emon -##nist -remade -uncover -##mack -complied -lei -newsweek -##jured -##parts -##enting -##pg -finer -guerrillas -athenian -deng -disused -stepmother -accuse -gingerly -seduction -confronting -##going -gora -nostalgia -sabres -virginity -wrenched -##minated -syndication -wielding -eyre -##gnon -##igny -behaved -taxpayer -sweeps -##growth -childless -gallant -##ywood -amplified -geraldine -scrape -##ffi -babylonian 
-fresco -##rdan -##kney -##position -1718 -restricting -tack -fukuoka -osborn -selector -partnering -##dlow -kia -tak -whitley -gables -##mania -mri -softness -immersion -##bots -##evsky -1713 -chilling -insignificant -pcs -##uis -elites -lina -purported -supplemental -teaming -##americana -##dding -##inton -proficient -rouen -##nage -##rret -niccolo -selects -##bread -fluffy -1621 -gruff -knotted -mukherjee -polgara -thrash -nicholls -secluded -smoothing -thru -corsica -loaf -whitaker -inquiries -##rrier -##kam -indochina -marlins -myles -peking -##tea -extracts -pastry -superhuman -connacht -vogel -##ditional -##het -##udged -##lash -gloss -quarries -refit -teaser -##alic -##gaon -20s -materialized -sling -camped -pickering -tung -tracker -pursuant -##cide -cranes -##cini -##typical -##viere -anhalt -overboard -workout -chores -fares -orphaned -stains -##logie -fenton -surpassing -joyah -triggers -##itte -grandmaster -##lass -##lists -clapping -fraudulent -ledger -nagasaki -##cor -##nosis -##tsa -eucalyptus -tun -##icio -##rney -##tara -dax -heroism -ina -wrexham -onboard -unsigned -##dates -moshe -galley -winnie -droplets -exiles -praises -watered -noodles -##aia -fein -leland -multicultural -stink -bingo -comets -erskine -modernized -canned -constraint -domestically -chemotherapy -featherweight -stifled -##mum -darkly -irresistible -refreshing -hasty -isolate -##oys -kitchener -planners -##wehr -cages -yarn -implant -toulon -elects -childbirth -yue -##lind -rightful -sportsman -junctions -remodeled -specifies -##rgh -##oons -complimented -##urgent -lister -ot -##logic -bequeathed -cheekbones -fontana -gabby -##dial -amadeus -corrugated -maverick -resented -triangles -##hered -##usly -nazareth -tyrol -1675 -assent -poorer -sectional -aegean -##cous -nylon -ghanaian -##egorical -##weig -cushions -forbid -fusiliers -obstruction -somerville -##scia -dime -earrings -elliptical -leyte -oder -polymers -timmy -midtown -piloted -settles -continual -externally -mayfield 
-##uh -enrichment -henson -keane -persians -1733 -benji -braden -pep -##efe -contenders -pepsi -valet -##isches -##asse -##earing -goofy -stroll -##amen -authoritarian -occurrences -adversary -ahmedabad -tangent -toppled -dorchester -1672 -modernism -marxism -islamist -charlemagne -exponential -racks -brunette -pic -skirmish -##bund -##lad -##powered -##yst -hoisted -messina -shatter -##ctum -jedi -vantage -##music -##neil -clemens -mahmoud -corrupted -authentication -lowry -nils -##washed -omnibus -wounding -jillian -##itors -##opped -serialized -narcotics -handheld -##arm -##plicity -intersecting -stimulating -##onis -crate -fellowships -hemingway -casinos -climatic -fordham -copeland -drip -beatty -leaflets -robber -brothel -madeira -##hedral -sphinx -ultrasound -##vana -valor -forbade -leonid -villas -##aldo -duane -marquez -##cytes -disadvantaged -forearms -kawasaki -reacts -consular -lax -uncles -uphold -##hopper -concepcion -dorsey -lass -##izan -arching -passageway -1708 -researches -tia -internationals -##graphs -##opers -distinguishes -javanese -divert -##uven -plotted -##listic -##rwin -##erik -##tify -affirmative -signifies -validation -##bson -kari -felicity -georgina -zulu -##eros -##rained -##rath -overcoming -argyll -##rbin -1734 -chiba -ratification -windy -earls -parapet -##marks -hunan -pristine -astrid -punta -##gart -brodie -##kota -##oder -malaga -minerva -rouse -##phonic -bellowed -pagoda -portals -reclamation -##gur -##odies -##⁄₄ -parentheses -quoting -allergic -palette -showcases -benefactor -heartland -nonlinear -##tness -bladed -cheerfully -scans -##ety -1666 -girlfriends -pedersen -hiram -sous -##liche -##nator -1683 -##nery -##orio -##umen -bobo -primaries -smiley -##cb -unearthed -uniformly -fis -metadata -1635 -ind -##oted -recoil -##titles -##tura -##ια -hilbert -jamestown -mcmillan -tulane -seychelles -##frid -antics -coli -fated -stucco -##grants -1654 -bulky -accolades -arrays -caledonian -carnage -optimism -puebla -##tative 
-##cave -enforcing -rotherham -dunlop -aeronautics -chimed -incline -zoning -archduke -hellenistic -##oses -##sions -candi -thong -##ople -magnate -rustic -##rsk -projective -slant -##offs -danes -hollis -vocalists -##ammed -congenital -contend -gesellschaft -##ocating -##pressive -douglass -quieter -##kshi -howled -salim -spontaneously -townsville -buena -southport -##bold -kato -1638 -faerie -stiffly -##vus -##rled -flawless -realising -taboo -##7th -straightening -jena -##hid -cartwright -berber -bertram -soloists -noses -coping -fission -hardin -inca -##cen -1717 -mobilized -vhf -##raf -biscuits -curate -##anial -gaunt -neighbourhoods -1540 -##abas -blanca -bypassed -sockets -behold -coincidentally -##bane -nara -shave -splinter -terrific -##arion -##erian -commonplace -juris -redwood -waistband -boxed -caitlin -fingerprints -jennie -naturalized -##ired -balfour -craters -jody -bungalow -hugely -quilt -glitter -pigeons -undertaker -bulging -constrained -##sil -##akh -assimilation -reworked -##person -persuasion -##pants -felicia -##cliff -##ulent -1732 -explodes -##dun -##inium -##zic -lyman -vulture -hog -overlook -begs -northwards -ow -spoil -##urer -fatima -favorably -accumulate -sargent -sorority -corresponded -dispersal -kochi -toned -##imi -##lita -internacional -newfound -##agger -##lynn -##rigue -booths -peanuts -##eborg -medicare -muriel -nur -##uram -crates -millennia -pajamas -worsened -##breakers -jimi -vanuatu -yawned -##udeau -carousel -##hony -hurdle -##ccus -##mounted -##pod -rv -##eche -airship -ambiguity -compulsion -recapture -##claiming -arthritis -##osomal -1667 -asserting -ngc -sniffing -dade -discontent -glendale -ported -##amina -defamation -rammed -##scent -fling -livingstone -##fleet -875 -apocalyptic -comrade -##lowe -cessna -eine -persecuted -subsistence -demi -hoop -reliefs -coptic -progressing -stemmed -perpetrators -1665 -priestess -##nio -dobson -ebony -rooster -itf -tortricidae -##bbon -##jian -cleanup -##jean -##øy -1721 
-eighties -taxonomic -holiness -##hearted -##spar -antilles -showcasing -stabilized -##nb -gia -mascara -michelangelo -dawned -##uria -##vinsky -extinguished -fitz -grotesque -£100 -##fera -##loid -##mous -barges -neue -throbbed -cipher -johnnie -##mpt -outburst -##swick -spearheaded -administrations -heartbreak -pixels -pleasantly -##enay -lombardy -plush -##nsed -bobbie -##hly -reapers -tremor -xiang -minogue -substantive -hitch -barak -##wyl -kwan -##encia -910 -obscene -elegance -indus -surfer -bribery -conserve -##hyllum -##masters -horatio -##fat -apes -rebound -psychotic -##pour -iteration -##mium -##vani -botanic -horribly -antiques -dispose -paxton -##hli -##wg -timeless -1704 -disregard -engraver -hounds -##bau -##version -looted -uno -facilitates -groans -masjid -rutland -antibody -disqualification -decatur -footballers -quake -slacks -48th -rein -scribe -stabilize -commits -exemplary -tho -##hort -##chison -pantry -traversed -##hiti -disrepair -identifiable -vibrated -baccalaureate -csa -interviewing -##iensis -##raße -greaves -wealthiest -classed -jogged -£5 -##atal -illuminating -knicks -respecting -##uno -scrubbed -##iji -##dles -kruger -moods -growls -raider -silvia -chefs -kam -cree -percival -##terol -gunter -counterattack -defiant -henan -ze -##rasia -##riety -equivalence -submissions -##fra -##thor -bautista -mechanically -##heater -cornice -herbal -templar -##mering -outputs -ruining -ligand -renumbered -extravagant -mika -blockbuster -eta -insurrection -##ilia -darkening -ferocious -pianos -strife -kinship -##aer -melee -##anor -##iste -##oue -decidedly -weep -##jad -##missive -##ppel -puget -unease -##gnant -1629 -hammering -kassel -wessex -##lga -bromwich -egan -paranoia -utilization -##atable -##idad -contradictory -provoke -##ols -##ouring -##tangled -knesset -##very -##lette -plumbing -##sden -greensboro -occult -sniff -zev -beaming -gamer -haggard -mahal -##olt -##pins -mendes -utmost -briefing -gunnery -##gut -##pher -##zh -##rok -1679 
-khalifa -sonya -##boot -principals -urbana -wiring -##liffe -##minating -##rrado -dahl -nyu -skepticism -townspeople -ithaca -lobster -somethin -##fur -##arina -##−1 -freighter -zimmerman -biceps -contractual -##herton -amend -hurrying -subconscious -##anal -meng -clermont -spawning -##eia -##lub -dignitaries -impetus -snacks -spotting -twigs -##bilis -##cz -##ouk -libertadores -nic -skylar -##aina -gustave -asean -##anum -dieter -legislatures -flirt -bromley -trolls -umar -##bbies -##tyle -blah -parc -bridgeport -crank -negligence -##nction -46th -constantin -molded -bandages -seriousness -00pm -siegel -carpets -compartments -upbeat -statehood -##dner -##edging -marko -platt -##hane -paving -##iy -1738 -abbess -impatience -limousine -nbl -lucille -mojo -nightfall -robbers -##nais -karel -brisk -calves -replicate -ascribed -telescopes -##olf -intimidated -ballast -specialization -aerodynamic -caliphate -visionary -##arded -epsilon -##aday -##onte -aggregation -auditory -boosted -reunification -kathmandu -loco -robyn -acknowledges -appointing -humanoid -newell -redeveloped -restraints -##tained -barbarians -chopper -1609 -italiana -##lez -##lho -investigates -wrestlemania -##anies -##bib -##falls -creaked -dragoons -gravely -minions -stupidity -volley -##harat -##week -musik -##eries -##uously -fungal -massimo -semantics -malvern -##ahl -##pee -discourage -embryo -imperialism -1910s -profoundly -##ddled -jiangsu -sparkled -stat -##holz -sweatshirt -tobin -##iction -sneered -##cheon -##oit -brit -causal -smyth -##neuve -diffuse -perrin -silvio -##ipes -##recht -detonated -iqbal -selma -##nism -##zumi -roasted -##riders -tay -##ados -##mament -##mut -##rud -completes -nipples -flavour -hirsch -##laus -calderon -sneakers -moravian -##ksha -1622 -##imeters -bodo -##isance -##pre -##ronia -anatomical -excerpt -##lke -dh -kunst -##tablished -##scoe -biomass -panted -unharmed -gael -housemates -montpellier -coa -rodents -tonic -hickory -singleton -##taro -1719 -aldo 
-breaststroke -dempsey -och -rocco -##cuit -merton -dissemination -midsummer -serials -##idi -haji -polynomials -enoch -prematurely -shutter -taunton -£3 -##grating -##inates -archangel -harassed -##asco -archway -dazzling -##ecin -1736 -sumo -wat -##kovich -1086 -honneur -##ently -##nostic -##ttal -##idon -1605 -1716 -rents -##gnan -hires -##ikh -##dant -howie -##rons -handler -retracted -shocks -1632 -arun -duluth -kepler -trumpeter -##lary -peeking -seasoned -trooper -##mara -laszlo -##iciencies -##rti -heterosexual -##inatory -indira -jogging -##inga -##lism -beit -dissatisfaction -malice -##ately -nedra -peeling -##rgeon -47th -stadiums -vertigo -##ains -iced -restroom -##plify -##tub -illustrating -pear -##chner -##sibility -inorganic -rappers -receipts -watery -##kura -lucinda -##oulos -reintroduced -##8th -##tched -gracefully -saxons -nutritional -wastewater -rained -favourites -bedrock -fisted -hallways -likeness -upscale -##lateral -1580 -blinds -prequel -##pps -##tama -deter -humiliating -restraining -tn -vents -1659 -laundering -recess -rosary -tractors -coulter -federer -##ifiers -##plin -persistence -##quitable -geschichte -pendulum -quakers -##beam -bassett -pictorial -koln -##sitor -drills -reciprocal -shooters -##cton -##tees -converge -pip -dmitri -donnelly -yamamoto -aqua -azores -demographics -hypnotic -spitfire -suspend -wryly -roderick -##rran -sebastien -##asurable -mavericks -##fles -himalayan -prodigy -##iance -transvaal -demonstrators -handcuffs -dodged -mcnamara -sublime -1726 -crazed -##efined -##till -ivo -pondered -reconciled -shrill -sava -##duk -bal -heresy -jaipur -goran -##nished -lux -shelly -whitehall -##hre -israelis -peacekeeping -##wled -1703 -demetrius -ousted -##arians -##zos -beale -anwar -backstroke -raged -shrinking -cremated -##yck -benign -towing -wadi -darmstadt -landfill -parana -soothe -colleen -sidewalks -mayfair -tumble -hepatitis -ferrer -superstructure -##gingly -##urse -##wee -anthropological -translators 
-##mies -closeness -hooves -##pw -mondays -##roll -##vita -landscaping -##urized -purification -sock -thorns -thwarted -jalan -tiberius -##taka -saline -##rito -confidently -khyber -sculptors -##ij -brahms -hammersmith -inspectors -battista -fivb -fragmentation -hackney -##uls -arresting -exercising -antoinette -bedfordshire -##zily -dyed -##hema -1656 -racetrack -variability -##tique -1655 -austrians -deteriorating -madman -theorists -aix -lehman -weathered -1731 -decreed -eruptions -1729 -flaw -quinlan -sorbonne -flutes -nunez -1711 -adored -downwards -fable -rasped -1712 -moritz -mouthful -renegade -shivers -stunts -dysfunction -restrain -translit -pancakes -##avio -##cision -##tray -vial -##lden -bain -##maid -##oxide -chihuahua -malacca -vimes -##rba -##rnier -1664 -donnie -plaques -##ually -bangs -floppy -huntsville -loretta -nikolay -##otte -eater -handgun -ubiquitous -##hett -eras -zodiac -1634 -##omorphic -1820s -##zog -cochran -##bula -##lithic -warring -##rada -dalai -excused -blazers -mcconnell -reeling -este -##abi -geese -hoax -taxon -##bla -guitarists -condemning -hunts -inversion -moffat -taekwondo -##lvis -1624 -stammered -##rest -##rzy -sousa -fundraiser -marylebone -navigable -uptown -cabbage -daniela -salman -shitty -whimper -##kian -##utive -programmers -protections -##rmi -##rued -forceful -##enes -fuss -##tao -##wash -brat -oppressive -reykjavik -spartak -ticking -##inkles -##kiewicz -adolph -horst -maui -protege -straighten -cpc -landau -concourse -clements -resultant -##ando -imaginative -joo -reactivated -##rem -##ffled -##uising -consultative -##guide -flop -kaitlyn -mergers -parenting -somber -##vron -supervise -vidhan -##imum -courtship -exemplified -harmonies -medallist -refining -##rrow -##ка -amara -##hum -goalscorer -sited -overshadowed -rohan -displeasure -secretive -multiplied -osman -##orth -engravings -padre -##kali -##veda -miniatures -mis -##yala -clap -pali -rook -##cana -1692 -57th -antennae -astro -oskar -1628 -bulldog 
-crotch -hackett -yucatan -##sure -amplifiers -brno -ferrara -migrating -##gree -thanking -turing -##eza -mccann -ting -andersson -onslaught -gaines -ganga -incense -standardization -##mation -sentai -scuba -stuffing -turquoise -waivers -alloys -##vitt -regaining -vaults -##clops -##gizing -digger -furry -memorabilia -probing -##iad -payton -rec -deutschland -filippo -opaque -seamen -zenith -afrikaans -##filtration -disciplined -inspirational -##merie -banco -confuse -grafton -tod -##dgets -championed -simi -anomaly -biplane -##ceptive -electrode -##para -1697 -cleavage -crossbow -swirl -informant -##lars -##osta -afi -bonfire -spec -##oux -lakeside -slump -##culus -##lais -##qvist -##rrigan -1016 -facades -borg -inwardly -cervical -pointedly -stabilization -##odon -chests -1699 -hacked -ctv -orthogonal -suzy -##lastic -gaulle -jacobite -rearview -##erted -ashby -##drik -##igate -##mise -##zbek -affectionately -canine -disperse -latham -##istles -##ivar -spielberg -##orin -##idium -ezekiel -cid -##sg -durga -middletown -##cina -customized -frontiers -harden -##etano -##zzy -1604 -bolsheviks -coloration -yoko -##bedo -briefs -slabs -debra -liquidation -plumage -##oin -blossoms -dementia -subsidy -1611 -proctor -relational -jerseys -parochial -ter -##ici -esa -peshawar -cavalier -loren -idiots -shamrock -1646 -dutton -malabar -mustache -##endez -##ocytes -referencing -terminates -marche -yarmouth -##sop -acton -mated -seton -subtly -baptised -beige -extremes -jolted -kristina -telecast -##actic -safeguard -waldo -##baldi -##bular -endeavors -sloppy -subterranean -##ensburg -##itung -delicately -pigment -tq -##scu -1626 -collisions -coveted -herds -##personal -##meister -##nberger -chopra -##ricting -abnormalities -defective -galician -lucie -##dilly -alligator -likened -##genase -burundi -clears -complexion -derelict -deafening -diablo -fingered -champaign -dogg -enlist -isotope -labeling -mrna -##erre -brilliance -marvelous -##ayo -1652 -crawley -ether -footed 
-dwellers -deserts -hamish -rubs -warlock -skimmed -##lizer -buick -embark -heraldic -irregularities -##ajan -kiara -##kulam -##ieg -antigen -kowalski -##lge -oakley -visitation -##mbit -vt -##suit -1570 -murderers -##miento -##rites -chimneys -##sling -condemn -custer -exchequer -havre -##ghi -fluctuations -##rations -dfb -hendricks -vaccines -##tarian -nietzsche -biking -juicy -##duced -brooding -scrolling -selangor -##ragan -annum -boomed -seminole -sugarcane -##dna -departmental -dismissing -innsbruck -arteries -ashok -batavia -daze -kun -overtook -##rga -##tlan -beheaded -gaddafi -holm -electronically -faulty -galilee -fractures -kobayashi -##lized -gunmen -magma -aramaic -mala -eastenders -inference -messengers -bf -##qu -bathrooms -##vere -1658 -flashbacks -ideally -misunderstood -##jali -##weather -mendez -##grounds -uncanny -##iii -1709 -friendships -##nbc -sacrament -accommodated -reiterated -logistical -pebbles -thumped -##escence -administering -decrees -drafts -##flight -##cased -##tula -futuristic -picket -intimidation -winthrop -##fahan -interfered -afar -francoise -morally -uta -cochin -croft -dwarfs -##bruck -##dents -##nami -biker -##hner -##meral -##isen -##ometric -##pres -##ан -brightened -meek -parcels -securely -gunners -##jhl -##zko -agile -hysteria -##lten -##rcus -bukit -champs -chevy -cuckoo -leith -sadler -theologians -welded -##section -1663 -plurality -xander -##rooms -##formed -shredded -temps -intimately -pau -tormented -##lok -##stellar -1618 -charred -essen -##mmel -alarms -spraying -ascot -blooms -twinkle -##abia -##apes -internment -obsidian -##chaft -snoop -##dav -##ooping -malibu -##tension -quiver -##itia -hays -mcintosh -travers -walsall -##ffie -1623 -beverley -schwarz -plunging -structurally -rosenthal -vikram -##tsk -ghz -##onda -##tiv -chalmers -groningen -pew -reckon -unicef -##rvis -55th -##gni -1651 -sulawesi -avila -cai -metaphysical -screwing -turbulence -##mberg -augusto -samba -56th -baffled -momentary -toxin 
-##urian -##wani -aachen -condoms -dali -steppe -##oed -##year -adolescence -dauphin -electrically -inaccessible -microscopy -nikita -##ega -atv -##enter -##oles -##oteric -accountants -punishments -wrongly -bribes -adventurous -clinch -flinders -southland -##hem -##kata -gough -##ciency -lads -soared -##ה -undergoes -deformation -outlawed -rubbish -##arus -##mussen -##nidae -##rzburg -arcs -##ingdon -##tituted -1695 -wheelbase -wheeling -bombardier -campground -zebra -##lices -##oj -##bain -lullaby -##ecure -donetsk -wylie -grenada -##arding -##ης -squinting -eireann -opposes -##andra -maximal -runes -##broken -##cuting -##iface -##ror -##rosis -additive -britney -adultery -triggering -##drome -detrimental -aarhus -containment -jc -swapped -vichy -##ioms -madly -##oric -##rag -brant -##ckey -1560 -1612 -broughton -rustling -##stems -##uder -asbestos -mentoring -##nivorous -finley -leaps -##isan -apical -pry -slits -substitutes -##dict -intuitive -fantasia -insistent -unreasonable -##igen -##vna -domed -hannover -margot -ponder -##zziness -impromptu -jian -rampage -stemming -##eft -andrey -gerais -whichever -amnesia -appropriated -anzac -clicks -modifying -ultimatum -cambrian -maids -verve -yellowstone -##mbs -conservatoire -##scribe -adherence -dinners -spectra -imperfect -mysteriously -sidekick -tatar -tuba -##aks -##ifolia -distrust -##athan -##zle -ronin -zac -##pse -celaena -instrumentalist -scents -skopje -##mbling -comical -compensated -vidal -condor -intersect -jingle -wavelengths -##urrent -mcqueen -##izzly -carp -weasel -militias -postdoctoral -eugen -gunslinger -##ɛ -faux -hospice -##for -appalled -derivation -dwarves -##elis -dilapidated -##folk -astoria -philology -##lwyn -##otho -##saka -inducing -philanthropy -##bf -##itative -geek -markedly -##yce -bessie -indices -##flict -frowns -resolving -weightlifting -tugs -cleric -contentious -1653 -mania -rms -##miya -##reate -##ruck -##tucket -bien -eels -marek -##ayton -##cence -discreet -unofficially 
-##ife -leaks -##bber -1705 -dung -compressor -hillsborough -pandit -shillings -distal -##skin -##tat -nosed -##nir -mangrove -undeveloped -##idia -textures -##inho -##rise -irritating -nay -amazingly -bancroft -apologetic -compassionate -kata -symphonies -##lovic -airspace -##lch -gifford -precautions -fulfillment -sevilla -vulgar -martinique -##urities -looting -piccolo -tidy -##dermott -quadrant -armchair -incomes -mathematicians -stampede -nilsson -##inking -##scan -foo -quarterfinal -##ostal -shang -shouldered -squirrels -##owe -vinegar -##bner -##rchy -##systems -delaying -##trics -ars -dwyer -rhapsody -sponsoring -##gration -bipolar -cinder -starters -##olio -##urst -signage -##nty -aground -figurative -mons -acquaintances -duets -erroneously -soyuz -elliptic -recreated -##cultural -##quette -##ssed -##tma -##zcz -moderator -scares -##itaire -##stones -##udence -juniper -sighting -##just -##nsen -britten -calabria -ry -bop -cramer -forsyth -stillness -airmen -gathers -unfit -##umber -##upt -taunting -seeker -streamlined -##bution -holster -schumann -tread -vox -##gano -##onzo -strive -dil -reforming -covent -newbury -predicting -##orro -decorate -tre -##puted -andover -asahi -dept -dunkirk -gills -##tori -buren -huskies -##stis -##stov -abstracts -bets -loosen -##opa -1682 -yearning -##glio -##sir -berman -effortlessly -enamel -napoli -persist -##peration -##uez -attache -elisa -invitations -##kic -accelerating -reindeer -boardwalk -clutches -nelly -polka -##kei -adamant -huey -lough -unbroken -adventurer -embroidery -inspecting -stanza -##ducted -naia -taluka -##pone -##roids -chases -deprivation -florian -##ppet -earthly -##lib -##ssee -colossal -foreigner -vet -freaks -patrice -rosewood -triassic -upstate -##pkins -dominates -ata -chants -ks -vo -##bley -##raya -##rmed -agra -infiltrate -##ailing -##ilation -##tzer -##uppe -##werk -binoculars -enthusiast -fujian -squeak -##avs -abolitionist -almeida -boredom -hampstead -marsden -rations -##ands -inflated 
-bonuses -rosalie -patna -##rco -detachments -penitentiary -54th -flourishing -woolf -##dion -##etched -papyrus -##lster -##nsor -##toy -bobbed -dismounted -endelle -inhuman -motorola -wince -wreath -##ticus -hideout -inspections -sanjay -disgrace -infused -pudding -stalks -##urbed -arsenic -leases -##hyl -##rrard -collarbone -##waite -##wil -dowry -##bant -##edance -genealogical -nitrate -salamanca -scandals -thyroid -necessitated -##` -##¡ -##¢ -##¦ -##¨ -##ª -##¬ -##´ -##¶ -##¾ -##¿ -##ð -##þ -##ħ -##œ -##ƒ -##ɐ -##ɑ -##ɒ -##ɕ -##ɣ -##ɨ -##ɪ -##ɫ -##ɬ -##ɯ -##ɲ -##ɴ -##ɹ -##ɾ -##ʀ -##ʁ -##ʂ -##ʃ -##ʉ -##ʊ -##ʋ -##ʌ -##ʎ -##ʐ -##ʑ -##ʒ -##ʔ -##ʲ -##ʳ -##ʷ -##ʸ -##ʻ -##ʼ -##ʾ -##ʿ -##ˡ -##ˣ -##ˤ -##ζ -##ξ -##щ -##ъ -##э -##ю -##ђ -##є -##ј -##љ -##њ -##ћ -##ӏ -##ա -##բ -##գ -##դ -##ե -##թ -##ի -##լ -##կ -##հ -##մ -##յ -##ն -##ո -##պ -##ս -##վ -##տ -##ր -##ւ -##ք -##־ -##א -##ב -##ג -##ד -##ו -##ז -##ח -##ט -##י -##ך -##כ -##ל -##ם -##מ -##ן -##נ -##ס -##ע -##ף -##פ -##ץ -##צ -##ק -##ר -##ש -##ת -##، -##ء -##ث -##ج -##ح -##خ -##ذ -##ز -##ش -##ص -##ض -##ط -##ظ -##غ -##ـ -##ف -##ق -##ك -##ى -##ٹ -##پ -##چ -##ک -##گ -##ں -##ھ -##ہ -##ے -##अ -##आ -##उ -##ए -##क -##ख -##ग -##च -##ज -##ट -##ड -##ण -##त -##थ -##द -##ध -##न -##प -##ब -##भ -##म -##य -##र -##ल -##व -##श -##ष -##स -##ह -##ा -##ि -##ी -##ो -##। -##॥ -##ং -##অ -##আ -##ই -##উ -##এ -##ও -##ক -##খ -##গ -##চ -##ছ -##জ -##ট -##ড -##ণ -##ত -##থ -##দ -##ধ -##ন -##প -##ব -##ভ -##ম -##য -##র -##ল -##শ -##ষ -##স -##হ -##া -##ি -##ী -##ে -##க -##ச -##ட -##த -##ந -##ன -##ப -##ம -##ய -##ர -##ல -##ள -##வ -##ா -##ி -##ு -##ே -##ை -##ನ -##ರ -##ಾ -##ක -##ය -##ර -##ල -##ව -##ා -##ต -##ท -##พ -##ล -##ว -##ส -##། -##ག -##ང -##ད -##ན -##པ -##བ -##མ -##འ -##ར -##ལ -##ས -##မ -##ა -##ბ -##გ -##დ -##ე -##ვ -##თ -##ი -##კ -##ლ -##მ -##ნ -##ო -##რ -##ს -##ტ -##უ -##ᄊ -##ᴬ -##ᴮ -##ᴰ -##ᴵ -##ᴺ -##ᵀ -##ᵇ -##ᵈ -##ᵖ -##ᵗ -##ᵣ -##ᵤ -##ᵥ -##ᶜ -##ᶠ -##‐ -##‑ -##‒ -##– -##— -##― -##‘ -##’ -##‚ -##“ -##” -##‡ -##… -##⁰ -##⁴ -##⁵ -##⁶ -##⁷ -##⁸ 
-##⁹ -##⁻ -##₅ -##₆ -##₇ -##₈ -##₉ -##₊ -##₍ -##₎ -##ₐ -##ₑ -##ₒ -##ₓ -##ₕ -##ₖ -##ₗ -##ₘ -##ₚ -##ₛ -##ₜ -##₤ -##₩ -##₱ -##₹ -##ℓ -##ℝ -##⅓ -##⅔ -##↦ -##⇄ -##⇌ -##∂ -##∅ -##∆ -##∇ -##∈ -##∗ -##∘ -##∧ -##∨ -##∪ -##⊂ -##⊆ -##⊕ -##⊗ -##☉ -##♯ -##⟨ -##⟩ -##ⱼ -##⺩ -##⺼ -##⽥ -##亻 -##宀 -##彳 -##忄 -##扌 -##氵 -##疒 -##糹 -##訁 -##辶 -##阝 -##龸 -##fi -##fl diff --git a/PixArt/LICENSE-PixArt b/PixArt/LICENSE-PixArt deleted file mode 100644 index 0ad25db..0000000 --- a/PixArt/LICENSE-PixArt +++ /dev/null @@ -1,661 +0,0 @@ - GNU AFFERO GENERAL PUBLIC LICENSE - Version 3, 19 November 2007 - - Copyright (C) 2007 Free Software Foundation, Inc. - Everyone is permitted to copy and distribute verbatim copies - of this license document, but changing it is not allowed. - - Preamble - - The GNU Affero General Public License is a free, copyleft license for -software and other kinds of works, specifically designed to ensure -cooperation with the community in the case of network server software. - - The licenses for most software and other practical works are designed -to take away your freedom to share and change the works. By contrast, -our General Public Licenses are intended to guarantee your freedom to -share and change all versions of a program--to make sure it remains free -software for all its users. - - When we speak of free software, we are referring to freedom, not -price. Our General Public Licenses are designed to make sure that you -have the freedom to distribute copies of free software (and charge for -them if you wish), that you receive source code or can get it if you -want it, that you can change the software or use pieces of it in new -free programs, and that you know you can do these things. - - Developers that use our General Public Licenses protect your rights -with two steps: (1) assert copyright on the software, and (2) offer -you this License which gives you legal permission to copy, distribute -and/or modify the software. 
- - A secondary benefit of defending all users' freedom is that -improvements made in alternate versions of the program, if they -receive widespread use, become available for other developers to -incorporate. Many developers of free software are heartened and -encouraged by the resulting cooperation. However, in the case of -software used on network servers, this result may fail to come about. -The GNU General Public License permits making a modified version and -letting the public access it on a server without ever releasing its -source code to the public. - - The GNU Affero General Public License is designed specifically to -ensure that, in such cases, the modified source code becomes available -to the community. It requires the operator of a network server to -provide the source code of the modified version running there to the -users of that server. Therefore, public use of a modified version, on -a publicly accessible server, gives the public access to the source -code of the modified version. - - An older license, called the Affero General Public License and -published by Affero, was designed to accomplish similar goals. This is -a different license, not a version of the Affero GPL, but Affero has -released a new version of the Affero GPL which permits relicensing under -this license. - - The precise terms and conditions for copying, distribution and -modification follow. - - TERMS AND CONDITIONS - - 0. Definitions. - - "This License" refers to version 3 of the GNU Affero General Public License. - - "Copyright" also means copyright-like laws that apply to other kinds of -works, such as semiconductor masks. - - "The Program" refers to any copyrightable work licensed under this -License. Each licensee is addressed as "you". "Licensees" and -"recipients" may be individuals or organizations. - - To "modify" a work means to copy from or adapt all or part of the work -in a fashion requiring copyright permission, other than the making of an -exact copy. 
The resulting work is called a "modified version" of the -earlier work or a work "based on" the earlier work. - - A "covered work" means either the unmodified Program or a work based -on the Program. - - To "propagate" a work means to do anything with it that, without -permission, would make you directly or secondarily liable for -infringement under applicable copyright law, except executing it on a -computer or modifying a private copy. Propagation includes copying, -distribution (with or without modification), making available to the -public, and in some countries other activities as well. - - To "convey" a work means any kind of propagation that enables other -parties to make or receive copies. Mere interaction with a user through -a computer network, with no transfer of a copy, is not conveying. - - An interactive user interface displays "Appropriate Legal Notices" -to the extent that it includes a convenient and prominently visible -feature that (1) displays an appropriate copyright notice, and (2) -tells the user that there is no warranty for the work (except to the -extent that warranties are provided), that licensees may convey the -work under this License, and how to view a copy of this License. If -the interface presents a list of user commands or options, such as a -menu, a prominent item in the list meets this criterion. - - 1. Source Code. - - The "source code" for a work means the preferred form of the work -for making modifications to it. "Object code" means any non-source -form of a work. - - A "Standard Interface" means an interface that either is an official -standard defined by a recognized standards body, or, in the case of -interfaces specified for a particular programming language, one that -is widely used among developers working in that language. 
- - The "System Libraries" of an executable work include anything, other -than the work as a whole, that (a) is included in the normal form of -packaging a Major Component, but which is not part of that Major -Component, and (b) serves only to enable use of the work with that -Major Component, or to implement a Standard Interface for which an -implementation is available to the public in source code form. A -"Major Component", in this context, means a major essential component -(kernel, window system, and so on) of the specific operating system -(if any) on which the executable work runs, or a compiler used to -produce the work, or an object code interpreter used to run it. - - The "Corresponding Source" for a work in object code form means all -the source code needed to generate, install, and (for an executable -work) run the object code and to modify the work, including scripts to -control those activities. However, it does not include the work's -System Libraries, or general-purpose tools or generally available free -programs which are used unmodified in performing those activities but -which are not part of the work. For example, Corresponding Source -includes interface definition files associated with source files for -the work, and the source code for shared libraries and dynamically -linked subprograms that the work is specifically designed to require, -such as by intimate data communication or control flow between those -subprograms and other parts of the work. - - The Corresponding Source need not include anything that users -can regenerate automatically from other parts of the Corresponding -Source. - - The Corresponding Source for a work in source code form is that -same work. - - 2. Basic Permissions. - - All rights granted under this License are granted for the term of -copyright on the Program, and are irrevocable provided the stated -conditions are met. This License explicitly affirms your unlimited -permission to run the unmodified Program. 
The output from running a -covered work is covered by this License only if the output, given its -content, constitutes a covered work. This License acknowledges your -rights of fair use or other equivalent, as provided by copyright law. - - You may make, run and propagate covered works that you do not -convey, without conditions so long as your license otherwise remains -in force. You may convey covered works to others for the sole purpose -of having them make modifications exclusively for you, or provide you -with facilities for running those works, provided that you comply with -the terms of this License in conveying all material for which you do -not control copyright. Those thus making or running the covered works -for you must do so exclusively on your behalf, under your direction -and control, on terms that prohibit them from making any copies of -your copyrighted material outside their relationship with you. - - Conveying under any other circumstances is permitted solely under -the conditions stated below. Sublicensing is not allowed; section 10 -makes it unnecessary. - - 3. Protecting Users' Legal Rights From Anti-Circumvention Law. - - No covered work shall be deemed part of an effective technological -measure under any applicable law fulfilling obligations under article -11 of the WIPO copyright treaty adopted on 20 December 1996, or -similar laws prohibiting or restricting circumvention of such -measures. - - When you convey a covered work, you waive any legal power to forbid -circumvention of technological measures to the extent such circumvention -is effected by exercising rights under this License with respect to -the covered work, and you disclaim any intention to limit operation or -modification of the work as a means of enforcing, against the work's -users, your or third parties' legal rights to forbid circumvention of -technological measures. - - 4. Conveying Verbatim Copies. 
- - You may convey verbatim copies of the Program's source code as you -receive it, in any medium, provided that you conspicuously and -appropriately publish on each copy an appropriate copyright notice; -keep intact all notices stating that this License and any -non-permissive terms added in accord with section 7 apply to the code; -keep intact all notices of the absence of any warranty; and give all -recipients a copy of this License along with the Program. - - You may charge any price or no price for each copy that you convey, -and you may offer support or warranty protection for a fee. - - 5. Conveying Modified Source Versions. - - You may convey a work based on the Program, or the modifications to -produce it from the Program, in the form of source code under the -terms of section 4, provided that you also meet all of these conditions: - - a) The work must carry prominent notices stating that you modified - it, and giving a relevant date. - - b) The work must carry prominent notices stating that it is - released under this License and any conditions added under section - 7. This requirement modifies the requirement in section 4 to - "keep intact all notices". - - c) You must license the entire work, as a whole, under this - License to anyone who comes into possession of a copy. This - License will therefore apply, along with any applicable section 7 - additional terms, to the whole of the work, and all its parts, - regardless of how they are packaged. This License gives no - permission to license the work in any other way, but it does not - invalidate such permission if you have separately received it. - - d) If the work has interactive user interfaces, each must display - Appropriate Legal Notices; however, if the Program has interactive - interfaces that do not display Appropriate Legal Notices, your - work need not make them do so. 
- - A compilation of a covered work with other separate and independent -works, which are not by their nature extensions of the covered work, -and which are not combined with it such as to form a larger program, -in or on a volume of a storage or distribution medium, is called an -"aggregate" if the compilation and its resulting copyright are not -used to limit the access or legal rights of the compilation's users -beyond what the individual works permit. Inclusion of a covered work -in an aggregate does not cause this License to apply to the other -parts of the aggregate. - - 6. Conveying Non-Source Forms. - - You may convey a covered work in object code form under the terms -of sections 4 and 5, provided that you also convey the -machine-readable Corresponding Source under the terms of this License, -in one of these ways: - - a) Convey the object code in, or embodied in, a physical product - (including a physical distribution medium), accompanied by the - Corresponding Source fixed on a durable physical medium - customarily used for software interchange. - - b) Convey the object code in, or embodied in, a physical product - (including a physical distribution medium), accompanied by a - written offer, valid for at least three years and valid for as - long as you offer spare parts or customer support for that product - model, to give anyone who possesses the object code either (1) a - copy of the Corresponding Source for all the software in the - product that is covered by this License, on a durable physical - medium customarily used for software interchange, for a price no - more than your reasonable cost of physically performing this - conveying of source, or (2) access to copy the - Corresponding Source from a network server at no charge. - - c) Convey individual copies of the object code with a copy of the - written offer to provide the Corresponding Source. 
This - alternative is allowed only occasionally and noncommercially, and - only if you received the object code with such an offer, in accord - with subsection 6b. - - d) Convey the object code by offering access from a designated - place (gratis or for a charge), and offer equivalent access to the - Corresponding Source in the same way through the same place at no - further charge. You need not require recipients to copy the - Corresponding Source along with the object code. If the place to - copy the object code is a network server, the Corresponding Source - may be on a different server (operated by you or a third party) - that supports equivalent copying facilities, provided you maintain - clear directions next to the object code saying where to find the - Corresponding Source. Regardless of what server hosts the - Corresponding Source, you remain obligated to ensure that it is - available for as long as needed to satisfy these requirements. - - e) Convey the object code using peer-to-peer transmission, provided - you inform other peers where the object code and Corresponding - Source of the work are being offered to the general public at no - charge under subsection 6d. - - A separable portion of the object code, whose source code is excluded -from the Corresponding Source as a System Library, need not be -included in conveying the object code work. - - A "User Product" is either (1) a "consumer product", which means any -tangible personal property which is normally used for personal, family, -or household purposes, or (2) anything designed or sold for incorporation -into a dwelling. In determining whether a product is a consumer product, -doubtful cases shall be resolved in favor of coverage. 
For a particular -product received by a particular user, "normally used" refers to a -typical or common use of that class of product, regardless of the status -of the particular user or of the way in which the particular user -actually uses, or expects or is expected to use, the product. A product -is a consumer product regardless of whether the product has substantial -commercial, industrial or non-consumer uses, unless such uses represent -the only significant mode of use of the product. - - "Installation Information" for a User Product means any methods, -procedures, authorization keys, or other information required to install -and execute modified versions of a covered work in that User Product from -a modified version of its Corresponding Source. The information must -suffice to ensure that the continued functioning of the modified object -code is in no case prevented or interfered with solely because -modification has been made. - - If you convey an object code work under this section in, or with, or -specifically for use in, a User Product, and the conveying occurs as -part of a transaction in which the right of possession and use of the -User Product is transferred to the recipient in perpetuity or for a -fixed term (regardless of how the transaction is characterized), the -Corresponding Source conveyed under this section must be accompanied -by the Installation Information. But this requirement does not apply -if neither you nor any third party retains the ability to install -modified object code on the User Product (for example, the work has -been installed in ROM). - - The requirement to provide Installation Information does not include a -requirement to continue to provide support service, warranty, or updates -for a work that has been modified or installed by the recipient, or for -the User Product in which it has been modified or installed. 
Access to a -network may be denied when the modification itself materially and -adversely affects the operation of the network or violates the rules and -protocols for communication across the network. - - Corresponding Source conveyed, and Installation Information provided, -in accord with this section must be in a format that is publicly -documented (and with an implementation available to the public in -source code form), and must require no special password or key for -unpacking, reading or copying. - - 7. Additional Terms. - - "Additional permissions" are terms that supplement the terms of this -License by making exceptions from one or more of its conditions. -Additional permissions that are applicable to the entire Program shall -be treated as though they were included in this License, to the extent -that they are valid under applicable law. If additional permissions -apply only to part of the Program, that part may be used separately -under those permissions, but the entire Program remains governed by -this License without regard to the additional permissions. - - When you convey a copy of a covered work, you may at your option -remove any additional permissions from that copy, or from any part of -it. (Additional permissions may be written to require their own -removal in certain cases when you modify the work.) You may place -additional permissions on material, added by you to a covered work, -for which you have or can give appropriate copyright permission. 
- - Notwithstanding any other provision of this License, for material you -add to a covered work, you may (if authorized by the copyright holders of -that material) supplement the terms of this License with terms: - - a) Disclaiming warranty or limiting liability differently from the - terms of sections 15 and 16 of this License; or - - b) Requiring preservation of specified reasonable legal notices or - author attributions in that material or in the Appropriate Legal - Notices displayed by works containing it; or - - c) Prohibiting misrepresentation of the origin of that material, or - requiring that modified versions of such material be marked in - reasonable ways as different from the original version; or - - d) Limiting the use for publicity purposes of names of licensors or - authors of the material; or - - e) Declining to grant rights under trademark law for use of some - trade names, trademarks, or service marks; or - - f) Requiring indemnification of licensors and authors of that - material by anyone who conveys the material (or modified versions of - it) with contractual assumptions of liability to the recipient, for - any liability that these contractual assumptions directly impose on - those licensors and authors. - - All other non-permissive additional terms are considered "further -restrictions" within the meaning of section 10. If the Program as you -received it, or any part of it, contains a notice stating that it is -governed by this License along with a term that is a further -restriction, you may remove that term. If a license document contains -a further restriction but permits relicensing or conveying under this -License, you may add to a covered work material governed by the terms -of that license document, provided that the further restriction does -not survive such relicensing or conveying. 
- - If you add terms to a covered work in accord with this section, you -must place, in the relevant source files, a statement of the -additional terms that apply to those files, or a notice indicating -where to find the applicable terms. - - Additional terms, permissive or non-permissive, may be stated in the -form of a separately written license, or stated as exceptions; -the above requirements apply either way. - - 8. Termination. - - You may not propagate or modify a covered work except as expressly -provided under this License. Any attempt otherwise to propagate or -modify it is void, and will automatically terminate your rights under -this License (including any patent licenses granted under the third -paragraph of section 11). - - However, if you cease all violation of this License, then your -license from a particular copyright holder is reinstated (a) -provisionally, unless and until the copyright holder explicitly and -finally terminates your license, and (b) permanently, if the copyright -holder fails to notify you of the violation by some reasonable means -prior to 60 days after the cessation. - - Moreover, your license from a particular copyright holder is -reinstated permanently if the copyright holder notifies you of the -violation by some reasonable means, this is the first time you have -received notice of violation of this License (for any work) from that -copyright holder, and you cure the violation prior to 30 days after -your receipt of the notice. - - Termination of your rights under this section does not terminate the -licenses of parties who have received copies or rights from you under -this License. If your rights have been terminated and not permanently -reinstated, you do not qualify to receive new licenses for the same -material under section 10. - - 9. Acceptance Not Required for Having Copies. - - You are not required to accept this License in order to receive or -run a copy of the Program. 
Ancillary propagation of a covered work -occurring solely as a consequence of using peer-to-peer transmission -to receive a copy likewise does not require acceptance. However, -nothing other than this License grants you permission to propagate or -modify any covered work. These actions infringe copyright if you do -not accept this License. Therefore, by modifying or propagating a -covered work, you indicate your acceptance of this License to do so. - - 10. Automatic Licensing of Downstream Recipients. - - Each time you convey a covered work, the recipient automatically -receives a license from the original licensors, to run, modify and -propagate that work, subject to this License. You are not responsible -for enforcing compliance by third parties with this License. - - An "entity transaction" is a transaction transferring control of an -organization, or substantially all assets of one, or subdividing an -organization, or merging organizations. If propagation of a covered -work results from an entity transaction, each party to that -transaction who receives a copy of the work also receives whatever -licenses to the work the party's predecessor in interest had or could -give under the previous paragraph, plus a right to possession of the -Corresponding Source of the work from the predecessor in interest, if -the predecessor has it or can get it with reasonable efforts. - - You may not impose any further restrictions on the exercise of the -rights granted or affirmed under this License. For example, you may -not impose a license fee, royalty, or other charge for exercise of -rights granted under this License, and you may not initiate litigation -(including a cross-claim or counterclaim in a lawsuit) alleging that -any patent claim is infringed by making, using, selling, offering for -sale, or importing the Program or any portion of it. - - 11. Patents. 
- - A "contributor" is a copyright holder who authorizes use under this -License of the Program or a work on which the Program is based. The -work thus licensed is called the contributor's "contributor version". - - A contributor's "essential patent claims" are all patent claims -owned or controlled by the contributor, whether already acquired or -hereafter acquired, that would be infringed by some manner, permitted -by this License, of making, using, or selling its contributor version, -but do not include claims that would be infringed only as a -consequence of further modification of the contributor version. For -purposes of this definition, "control" includes the right to grant -patent sublicenses in a manner consistent with the requirements of -this License. - - Each contributor grants you a non-exclusive, worldwide, royalty-free -patent license under the contributor's essential patent claims, to -make, use, sell, offer for sale, import and otherwise run, modify and -propagate the contents of its contributor version. - - In the following three paragraphs, a "patent license" is any express -agreement or commitment, however denominated, not to enforce a patent -(such as an express permission to practice a patent or covenant not to -sue for patent infringement). To "grant" such a patent license to a -party means to make such an agreement or commitment not to enforce a -patent against the party. 
- - If you convey a covered work, knowingly relying on a patent license, -and the Corresponding Source of the work is not available for anyone -to copy, free of charge and under the terms of this License, through a -publicly available network server or other readily accessible means, -then you must either (1) cause the Corresponding Source to be so -available, or (2) arrange to deprive yourself of the benefit of the -patent license for this particular work, or (3) arrange, in a manner -consistent with the requirements of this License, to extend the patent -license to downstream recipients. "Knowingly relying" means you have -actual knowledge that, but for the patent license, your conveying the -covered work in a country, or your recipient's use of the covered work -in a country, would infringe one or more identifiable patents in that -country that you have reason to believe are valid. - - If, pursuant to or in connection with a single transaction or -arrangement, you convey, or propagate by procuring conveyance of, a -covered work, and grant a patent license to some of the parties -receiving the covered work authorizing them to use, propagate, modify -or convey a specific copy of the covered work, then the patent license -you grant is automatically extended to all recipients of the covered -work and works based on it. - - A patent license is "discriminatory" if it does not include within -the scope of its coverage, prohibits the exercise of, or is -conditioned on the non-exercise of one or more of the rights that are -specifically granted under this License. 
You may not convey a covered -work if you are a party to an arrangement with a third party that is -in the business of distributing software, under which you make payment -to the third party based on the extent of your activity of conveying -the work, and under which the third party grants, to any of the -parties who would receive the covered work from you, a discriminatory -patent license (a) in connection with copies of the covered work -conveyed by you (or copies made from those copies), or (b) primarily -for and in connection with specific products or compilations that -contain the covered work, unless you entered into that arrangement, -or that patent license was granted, prior to 28 March 2007. - - Nothing in this License shall be construed as excluding or limiting -any implied license or other defenses to infringement that may -otherwise be available to you under applicable patent law. - - 12. No Surrender of Others' Freedom. - - If conditions are imposed on you (whether by court order, agreement or -otherwise) that contradict the conditions of this License, they do not -excuse you from the conditions of this License. If you cannot convey a -covered work so as to satisfy simultaneously your obligations under this -License and any other pertinent obligations, then as a consequence you may -not convey it at all. For example, if you agree to terms that obligate you -to collect a royalty for further conveying from those to whom you convey -the Program, the only way you could satisfy both those terms and this -License would be to refrain entirely from conveying the Program. - - 13. Remote Network Interaction; Use with the GNU General Public License. 
- - Notwithstanding any other provision of this License, if you modify the -Program, your modified version must prominently offer all users -interacting with it remotely through a computer network (if your version -supports such interaction) an opportunity to receive the Corresponding -Source of your version by providing access to the Corresponding Source -from a network server at no charge, through some standard or customary -means of facilitating copying of software. This Corresponding Source -shall include the Corresponding Source for any work covered by version 3 -of the GNU General Public License that is incorporated pursuant to the -following paragraph. - - Notwithstanding any other provision of this License, you have -permission to link or combine any covered work with a work licensed -under version 3 of the GNU General Public License into a single -combined work, and to convey the resulting work. The terms of this -License will continue to apply to the part which is the covered work, -but the work with which it is combined will remain governed by version -3 of the GNU General Public License. - - 14. Revised Versions of this License. - - The Free Software Foundation may publish revised and/or new versions of -the GNU Affero General Public License from time to time. Such new versions -will be similar in spirit to the present version, but may differ in detail to -address new problems or concerns. - - Each version is given a distinguishing version number. If the -Program specifies that a certain numbered version of the GNU Affero General -Public License "or any later version" applies to it, you have the -option of following the terms and conditions either of that numbered -version or of any later version published by the Free Software -Foundation. If the Program does not specify a version number of the -GNU Affero General Public License, you may choose any version ever published -by the Free Software Foundation. 
- - If the Program specifies that a proxy can decide which future -versions of the GNU Affero General Public License can be used, that proxy's -public statement of acceptance of a version permanently authorizes you -to choose that version for the Program. - - Later license versions may give you additional or different -permissions. However, no additional obligations are imposed on any -author or copyright holder as a result of your choosing to follow a -later version. - - 15. Disclaimer of Warranty. - - THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY -APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT -HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY -OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, -THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM -IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF -ALL NECESSARY SERVICING, REPAIR OR CORRECTION. - - 16. Limitation of Liability. - - IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING -WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS -THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY -GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE -USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF -DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD -PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), -EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF -SUCH DAMAGES. - - 17. Interpretation of Sections 15 and 16. 
- - If the disclaimer of warranty and limitation of liability provided -above cannot be given local legal effect according to their terms, -reviewing courts shall apply local law that most closely approximates -an absolute waiver of all civil liability in connection with the -Program, unless a warranty or assumption of liability accompanies a -copy of the Program in return for a fee. - - END OF TERMS AND CONDITIONS - - How to Apply These Terms to Your New Programs - - If you develop a new program, and you want it to be of the greatest -possible use to the public, the best way to achieve this is to make it -free software which everyone can redistribute and change under these terms. - - To do so, attach the following notices to the program. It is safest -to attach them to the start of each source file to most effectively -state the exclusion of warranty; and each file should have at least -the "copyright" line and a pointer to where the full notice is found. - - <one line to give the program's name and a brief idea of what it does.> - Copyright (C) <year>  <name of author> - - This program is free software: you can redistribute it and/or modify - it under the terms of the GNU Affero General Public License as published - by the Free Software Foundation, either version 3 of the License, or - (at your option) any later version. - - This program is distributed in the hope that it will be useful, - but WITHOUT ANY WARRANTY; without even the implied warranty of - MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the - GNU Affero General Public License for more details. - - You should have received a copy of the GNU Affero General Public License - along with this program. If not, see <https://www.gnu.org/licenses/>. - -Also add information on how to contact you by electronic and paper mail. - - If your software can interact with users remotely through a computer -network, you should also make sure that it provides a way for users to -get its source. For example, if your program is a web application, its -interface could display a "Source" link that leads users to an archive -of the code. 
There are many ways you could offer source, and different -solutions will be better for different programs; see section 13 for the -specific requirements. - - You should also get your employer (if you work as a programmer) or school, -if any, to sign a "copyright disclaimer" for the program, if necessary. -For more information on this, and how to apply and follow the GNU AGPL, see -<https://www.gnu.org/licenses/>. diff --git a/PixArt/conf.py b/PixArt/conf.py deleted file mode 100644 index 128146f..0000000 --- a/PixArt/conf.py +++ /dev/null @@ -1,140 +0,0 @@ -""" -List of all PixArt model types / settings -""" - -sampling_settings = { - "beta_schedule" : "sqrt_linear", - "linear_start" : 0.0001, - "linear_end" : 0.02, - "timesteps" : 1000, -} - -pixart_conf = { - "PixArtMS_XL_2": { # models/PixArtMS - "target": "PixArtMS", - "unet_config": { - "input_size" : 1024//8, - "depth" : 28, - "num_heads" : 16, - "patch_size" : 2, - "hidden_size" : 1152, - "pe_interpolation": 2, - }, - "sampling_settings" : sampling_settings, - }, - "PixArtMS_Sigma_XL_2": { - "target": "PixArtMSSigma", - "unet_config": { - "input_size" : 1024//8, - "token_num" : 300, - "depth" : 28, - "num_heads" : 16, - "patch_size" : 2, - "hidden_size" : 1152, - "micro_condition": False, - "pe_interpolation": 2, - "model_max_length": 300, - }, - "sampling_settings" : sampling_settings, - }, - "PixArtMS_Sigma_XL_2_900M": { - "target": "PixArtMSSigma", - "unet_config": { - "input_size": 1024 // 8, - "token_num": 300, - "depth": 42, - "num_heads": 16, - "patch_size": 2, - "hidden_size": 1152, - "micro_condition": False, - "pe_interpolation": 2, - "model_max_length": 300, - }, - "sampling_settings": sampling_settings, - }, - "PixArtMS_Sigma_XL_2_2K": { - "target": "PixArtMSSigma", - "unet_config": { - "input_size" : 2048//8, - "token_num" : 300, - "depth" : 28, - "num_heads" : 16, - "patch_size" : 2, - "hidden_size" : 1152, - "micro_condition": False, - "pe_interpolation": 4, - "model_max_length": 300, - }, - "sampling_settings" : 
sampling_settings, - }, - "PixArt_XL_2": { # models/PixArt - "target": "PixArt", - "unet_config": { - "input_size" : 512//8, - "token_num" : 120, - "depth" : 28, - "num_heads" : 16, - "patch_size" : 2, - "hidden_size" : 1152, - "pe_interpolation": 1, - }, - "sampling_settings" : sampling_settings, - }, -} - -pixart_conf.update({ # controlnet models - "ControlPixArtHalf": { - "target": "ControlPixArtHalf", - "unet_config": pixart_conf["PixArt_XL_2"]["unet_config"], - "sampling_settings": pixart_conf["PixArt_XL_2"]["sampling_settings"], - }, - "ControlPixArtMSHalf": { - "target": "ControlPixArtMSHalf", - "unet_config": pixart_conf["PixArtMS_XL_2"]["unet_config"], - "sampling_settings": pixart_conf["PixArtMS_XL_2"]["sampling_settings"], - } -}) - -pixart_res = { - "PixArtMS_XL_2": { # models/PixArtMS 1024x1024 - '0.25': [512, 2048], '0.26': [512, 1984], '0.27': [512, 1920], '0.28': [512, 1856], - '0.32': [576, 1792], '0.33': [576, 1728], '0.35': [576, 1664], '0.40': [640, 1600], - '0.42': [640, 1536], '0.48': [704, 1472], '0.50': [704, 1408], '0.52': [704, 1344], - '0.57': [768, 1344], '0.60': [768, 1280], '0.68': [832, 1216], '0.72': [832, 1152], - '0.78': [896, 1152], '0.82': [896, 1088], '0.88': [960, 1088], '0.94': [960, 1024], - '1.00': [1024,1024], '1.07': [1024, 960], '1.13': [1088, 960], '1.21': [1088, 896], - '1.29': [1152, 896], '1.38': [1152, 832], '1.46': [1216, 832], '1.67': [1280, 768], - '1.75': [1344, 768], '2.00': [1408, 704], '2.09': [1472, 704], '2.40': [1536, 640], - '2.50': [1600, 640], '2.89': [1664, 576], '3.00': [1728, 576], '3.11': [1792, 576], - '3.62': [1856, 512], '3.75': [1920, 512], '3.88': [1984, 512], '4.00': [2048, 512], - }, - "PixArt_XL_2": { # models/PixArt 512x512 - '0.25': [256,1024], '0.26': [256, 992], '0.27': [256, 960], '0.28': [256, 928], - '0.32': [288, 896], '0.33': [288, 864], '0.35': [288, 832], '0.40': [320, 800], - '0.42': [320, 768], '0.48': [352, 736], '0.50': [352, 704], '0.52': [352, 672], - '0.57': [384, 672], 
'0.60': [384, 640], '0.68': [416, 608], '0.72': [416, 576], - '0.78': [448, 576], '0.82': [448, 544], '0.88': [480, 544], '0.94': [480, 512], - '1.00': [512, 512], '1.07': [512, 480], '1.13': [544, 480], '1.21': [544, 448], - '1.29': [576, 448], '1.38': [576, 416], '1.46': [608, 416], '1.67': [640, 384], - '1.75': [672, 384], '2.00': [704, 352], '2.09': [736, 352], '2.40': [768, 320], - '2.50': [800, 320], '2.89': [832, 288], '3.00': [864, 288], '3.11': [896, 288], - '3.62': [928, 256], '3.75': [960, 256], '3.88': [992, 256], '4.00': [1024,256] - }, - "PixArtMS_Sigma_XL_2_2K": { - '0.25': [1024, 4096], '0.26': [1024, 3968], '0.27': [1024, 3840], '0.28': [1024, 3712], - '0.32': [1152, 3584], '0.33': [1152, 3456], '0.35': [1152, 3328], '0.40': [1280, 3200], - '0.42': [1280, 3072], '0.48': [1408, 2944], '0.50': [1408, 2816], '0.52': [1408, 2688], - '0.57': [1536, 2688], '0.60': [1536, 2560], '0.68': [1664, 2432], '0.72': [1664, 2304], - '0.78': [1792, 2304], '0.82': [1792, 2176], '0.88': [1920, 2176], '0.94': [1920, 2048], - '1.00': [2048, 2048], '1.07': [2048, 1920], '1.13': [2176, 1920], '1.21': [2176, 1792], - '1.29': [2304, 1792], '1.38': [2304, 1664], '1.46': [2432, 1664], '1.67': [2560, 1536], - '1.75': [2688, 1536], '2.00': [2816, 1408], '2.09': [2944, 1408], '2.40': [3072, 1280], - '2.50': [3200, 1280], '2.89': [3328, 1152], '3.00': [3456, 1152], '3.11': [3584, 1152], - '3.62': [3712, 1024], '3.75': [3840, 1024], '3.88': [3968, 1024], '4.00': [4096, 1024] - } -} -# These should be the same -pixart_res.update({ - "PixArtMS_Sigma_XL_2": pixart_res["PixArtMS_XL_2"], - "PixArtMS_Sigma_XL_2_512": pixart_res["PixArt_XL_2"], -}) diff --git a/PixArt/diffusers_convert.py b/PixArt/diffusers_convert.py index 312ea9d..6013d17 100644 --- a/PixArt/diffusers_convert.py +++ b/PixArt/diffusers_convert.py @@ -4,220 +4,116 @@ import torch conversion_map_ms = [ # for multi_scale_train (MS) - # Resolution - ("csize_embedder.mlp.0.weight", 
"adaln_single.emb.resolution_embedder.linear_1.weight"), - ("csize_embedder.mlp.0.bias", "adaln_single.emb.resolution_embedder.linear_1.bias"), - ("csize_embedder.mlp.2.weight", "adaln_single.emb.resolution_embedder.linear_2.weight"), - ("csize_embedder.mlp.2.bias", "adaln_single.emb.resolution_embedder.linear_2.bias"), - # Aspect ratio - ("ar_embedder.mlp.0.weight", "adaln_single.emb.aspect_ratio_embedder.linear_1.weight"), - ("ar_embedder.mlp.0.bias", "adaln_single.emb.aspect_ratio_embedder.linear_1.bias"), - ("ar_embedder.mlp.2.weight", "adaln_single.emb.aspect_ratio_embedder.linear_2.weight"), - ("ar_embedder.mlp.2.bias", "adaln_single.emb.aspect_ratio_embedder.linear_2.bias"), + # Resolution + ("csize_embedder.mlp.0.weight", "adaln_single.emb.resolution_embedder.linear_1.weight"), + ("csize_embedder.mlp.0.bias", "adaln_single.emb.resolution_embedder.linear_1.bias"), + ("csize_embedder.mlp.2.weight", "adaln_single.emb.resolution_embedder.linear_2.weight"), + ("csize_embedder.mlp.2.bias", "adaln_single.emb.resolution_embedder.linear_2.bias"), + # Aspect ratio + ("ar_embedder.mlp.0.weight", "adaln_single.emb.aspect_ratio_embedder.linear_1.weight"), + ("ar_embedder.mlp.0.bias", "adaln_single.emb.aspect_ratio_embedder.linear_1.bias"), + ("ar_embedder.mlp.2.weight", "adaln_single.emb.aspect_ratio_embedder.linear_2.weight"), + ("ar_embedder.mlp.2.bias", "adaln_single.emb.aspect_ratio_embedder.linear_2.bias"), ] def get_depth(state_dict): - return sum(key.endswith('.attn1.to_k.bias') for key in state_dict.keys()) + return sum(key.endswith('.attn1.to_k.bias') for key in state_dict.keys()) def get_lora_depth(state_dict): - cnt = max([ - sum(key.endswith('.attn1.to_k.lora_A.weight') for key in state_dict.keys()), - sum(key.endswith('_attn1_to_k.lora_A.weight') for key in state_dict.keys()), - sum(key.endswith('.attn1.to_k.lora_up.weight') for key in state_dict.keys()), - sum(key.endswith('_attn1_to_k.lora_up.weight') for key in state_dict.keys()), - ]) - assert cnt > 0, 
"Unable to detect model depth!" - return cnt + cnt = max([ + sum(key.endswith('.attn1.to_k.lora_A.weight') for key in state_dict.keys()), + sum(key.endswith('_attn1_to_k.lora_A.weight') for key in state_dict.keys()), + sum(key.endswith('.attn1.to_k.lora_up.weight') for key in state_dict.keys()), + sum(key.endswith('_attn1_to_k.lora_up.weight') for key in state_dict.keys()), + ]) + assert cnt > 0, "Unable to detect model depth!" + return cnt def get_conversion_map(state_dict): - conversion_map = [ # main SD conversion map (PixArt reference, HF Diffusers) - # Patch embeddings - ("x_embedder.proj.weight", "pos_embed.proj.weight"), - ("x_embedder.proj.bias", "pos_embed.proj.bias"), - # Caption projection - ("y_embedder.y_embedding", "caption_projection.y_embedding"), - ("y_embedder.y_proj.fc1.weight", "caption_projection.linear_1.weight"), - ("y_embedder.y_proj.fc1.bias", "caption_projection.linear_1.bias"), - ("y_embedder.y_proj.fc2.weight", "caption_projection.linear_2.weight"), - ("y_embedder.y_proj.fc2.bias", "caption_projection.linear_2.bias"), - # AdaLN-single LN - ("t_embedder.mlp.0.weight", "adaln_single.emb.timestep_embedder.linear_1.weight"), - ("t_embedder.mlp.0.bias", "adaln_single.emb.timestep_embedder.linear_1.bias"), - ("t_embedder.mlp.2.weight", "adaln_single.emb.timestep_embedder.linear_2.weight"), - ("t_embedder.mlp.2.bias", "adaln_single.emb.timestep_embedder.linear_2.bias"), - # Shared norm - ("t_block.1.weight", "adaln_single.linear.weight"), - ("t_block.1.bias", "adaln_single.linear.bias"), - # Final block - ("final_layer.linear.weight", "proj_out.weight"), - ("final_layer.linear.bias", "proj_out.bias"), - ("final_layer.scale_shift_table", "scale_shift_table"), - ] - - # Add actual transformer blocks - for depth in range(get_depth(state_dict)): - # Transformer blocks - conversion_map += [ - (f"blocks.{depth}.scale_shift_table", f"transformer_blocks.{depth}.scale_shift_table"), - # Projection - (f"blocks.{depth}.attn.proj.weight", 
f"transformer_blocks.{depth}.attn1.to_out.0.weight"), - (f"blocks.{depth}.attn.proj.bias", f"transformer_blocks.{depth}.attn1.to_out.0.bias"), - # Feed-forward - (f"blocks.{depth}.mlp.fc1.weight", f"transformer_blocks.{depth}.ff.net.0.proj.weight"), - (f"blocks.{depth}.mlp.fc1.bias", f"transformer_blocks.{depth}.ff.net.0.proj.bias"), - (f"blocks.{depth}.mlp.fc2.weight", f"transformer_blocks.{depth}.ff.net.2.weight"), - (f"blocks.{depth}.mlp.fc2.bias", f"transformer_blocks.{depth}.ff.net.2.bias"), - # Cross-attention (proj) - (f"blocks.{depth}.cross_attn.proj.weight" ,f"transformer_blocks.{depth}.attn2.to_out.0.weight"), - (f"blocks.{depth}.cross_attn.proj.bias" ,f"transformer_blocks.{depth}.attn2.to_out.0.bias"), - ] - return conversion_map + conversion_map = [ # main SD conversion map (PixArt reference, HF Diffusers) + # Patch embeddings + ("x_embedder.proj.weight", "pos_embed.proj.weight"), + ("x_embedder.proj.bias", "pos_embed.proj.bias"), + # Caption projection + ("y_embedder.y_embedding", "caption_projection.y_embedding"), + ("y_embedder.y_proj.fc1.weight", "caption_projection.linear_1.weight"), + ("y_embedder.y_proj.fc1.bias", "caption_projection.linear_1.bias"), + ("y_embedder.y_proj.fc2.weight", "caption_projection.linear_2.weight"), + ("y_embedder.y_proj.fc2.bias", "caption_projection.linear_2.bias"), + # AdaLN-single LN + ("t_embedder.mlp.0.weight", "adaln_single.emb.timestep_embedder.linear_1.weight"), + ("t_embedder.mlp.0.bias", "adaln_single.emb.timestep_embedder.linear_1.bias"), + ("t_embedder.mlp.2.weight", "adaln_single.emb.timestep_embedder.linear_2.weight"), + ("t_embedder.mlp.2.bias", "adaln_single.emb.timestep_embedder.linear_2.bias"), + # Shared norm + ("t_block.1.weight", "adaln_single.linear.weight"), + ("t_block.1.bias", "adaln_single.linear.bias"), + # Final block + ("final_layer.linear.weight", "proj_out.weight"), + ("final_layer.linear.bias", "proj_out.bias"), + ("final_layer.scale_shift_table", "scale_shift_table"), + ] + + # Add actual 
transformer blocks + for depth in range(get_depth(state_dict)): + # Transformer blocks + conversion_map += [ + (f"blocks.{depth}.scale_shift_table", f"transformer_blocks.{depth}.scale_shift_table"), + # Projection + (f"blocks.{depth}.attn.proj.weight", f"transformer_blocks.{depth}.attn1.to_out.0.weight"), + (f"blocks.{depth}.attn.proj.bias", f"transformer_blocks.{depth}.attn1.to_out.0.bias"), + # Feed-forward + (f"blocks.{depth}.mlp.fc1.weight", f"transformer_blocks.{depth}.ff.net.0.proj.weight"), + (f"blocks.{depth}.mlp.fc1.bias", f"transformer_blocks.{depth}.ff.net.0.proj.bias"), + (f"blocks.{depth}.mlp.fc2.weight", f"transformer_blocks.{depth}.ff.net.2.weight"), + (f"blocks.{depth}.mlp.fc2.bias", f"transformer_blocks.{depth}.ff.net.2.bias"), + # Cross-attention (proj) + (f"blocks.{depth}.cross_attn.proj.weight" ,f"transformer_blocks.{depth}.attn2.to_out.0.weight"), + (f"blocks.{depth}.cross_attn.proj.bias" ,f"transformer_blocks.{depth}.attn2.to_out.0.bias"), + ] + return conversion_map def find_prefix(state_dict, target_key): - prefix = "" - for k in state_dict.keys(): - if k.endswith(target_key): - prefix = k.split(target_key)[0] - break - return prefix + prefix = "" + for k in state_dict.keys(): + if k.endswith(target_key): + prefix = k.split(target_key)[0] + break + return prefix def convert_state_dict(state_dict): - if "adaln_single.emb.resolution_embedder.linear_1.weight" in state_dict.keys(): - cmap = get_conversion_map(state_dict) + conversion_map_ms - else: - cmap = get_conversion_map(state_dict) - - missing = [k for k,v in cmap if v not in state_dict] - new_state_dict = {k: state_dict[v] for k,v in cmap if k not in missing} - matched = list(v for k,v in cmap if v in state_dict.keys()) - - for depth in range(get_depth(state_dict)): - for wb in ["weight", "bias"]: - # Self Attention - key = lambda a: f"transformer_blocks.{depth}.attn1.to_{a}.{wb}" - new_state_dict[f"blocks.{depth}.attn.qkv.{wb}"] = torch.cat(( - state_dict[key('q')], state_dict[key('k')], 
state_dict[key('v')] - ), dim=0) - matched += [key('q'), key('k'), key('v')] - - # Cross-attention (linear) - key = lambda a: f"transformer_blocks.{depth}.attn2.to_{a}.{wb}" - new_state_dict[f"blocks.{depth}.cross_attn.q_linear.{wb}"] = state_dict[key('q')] - new_state_dict[f"blocks.{depth}.cross_attn.kv_linear.{wb}"] = torch.cat(( - state_dict[key('k')], state_dict[key('v')] - ), dim=0) - matched += [key('q'), key('k'), key('v')] - - if len(matched) < len(state_dict): - print(f"PixArt: UNET conversion has leftover keys! ({len(matched)} vs {len(state_dict)})") - print(list( set(state_dict.keys()) - set(matched) )) - - if len(missing) > 0: - print(f"PixArt: UNET conversion has missing keys!") - print(missing) - - return new_state_dict - -# Same as above but for LoRA weights: -def convert_lora_state_dict(state_dict, peft=True): - # koyha - rep_ak = lambda x: x.replace(".weight", ".lora_down.weight") - rep_bk = lambda x: x.replace(".weight", ".lora_up.weight") - rep_pk = lambda x: x.replace(".weight", ".alpha") - if peft: # peft - rep_ap = lambda x: x.replace(".weight", ".lora_A.weight") - rep_bp = lambda x: x.replace(".weight", ".lora_B.weight") - rep_pp = lambda x: x.replace(".weight", ".alpha") - - prefix = find_prefix(state_dict, "adaln_single.linear.lora_A.weight") - state_dict = {k[len(prefix):]:v for k,v in state_dict.items()} - else: # OneTrainer - rep_ap = lambda x: x.replace(".", "_")[:-7] + ".lora_down.weight" - rep_bp = lambda x: x.replace(".", "_")[:-7] + ".lora_up.weight" - rep_pp = lambda x: x.replace(".", "_")[:-7] + ".alpha" - - prefix = "lora_transformer_" - t5_marker = "lora_te_encoder" - t5_keys = [] - for key in list(state_dict.keys()): - if key.startswith(prefix): - state_dict[key[len(prefix):]] = state_dict.pop(key) - elif t5_marker in key: - t5_keys.append(state_dict.pop(key)) - if len(t5_keys) > 0: - print(f"Text Encoder not supported for PixArt LoRA, ignoring {len(t5_keys)} keys") - - cmap = [] - cmap_unet = get_conversion_map(state_dict) + 
conversion_map_ms # todo: 512 model - for k, v in cmap_unet: - if v.endswith(".weight"): - cmap.append((rep_ak(k), rep_ap(v))) - cmap.append((rep_bk(k), rep_bp(v))) - if not peft: - cmap.append((rep_pk(k), rep_pp(v))) - - missing = [k for k,v in cmap if v not in state_dict] - new_state_dict = {k: state_dict[v] for k,v in cmap if k not in missing} - matched = list(v for k,v in cmap if v in state_dict.keys()) - - lora_depth = get_lora_depth(state_dict) - for fp, fk in ((rep_ap, rep_ak),(rep_bp, rep_bk)): - for depth in range(lora_depth): - # Self Attention - key = lambda a: fp(f"transformer_blocks.{depth}.attn1.to_{a}.weight") - new_state_dict[fk(f"blocks.{depth}.attn.qkv.weight")] = torch.cat(( - state_dict[key('q')], state_dict[key('k')], state_dict[key('v')] - ), dim=0) - - matched += [key('q'), key('k'), key('v')] - if not peft: - akey = lambda a: rep_pp(f"transformer_blocks.{depth}.attn1.to_{a}.weight") - new_state_dict[rep_pk((f"blocks.{depth}.attn.qkv.weight"))] = state_dict[akey("q")] - matched += [akey('q'), akey('k'), akey('v')] - - # Self Attention projection? 
- key = lambda a: fp(f"transformer_blocks.{depth}.attn1.to_{a}.weight") - new_state_dict[fk(f"blocks.{depth}.attn.proj.weight")] = state_dict[key('out.0')] - matched += [key('out.0')] - - # Cross-attention (linear) - key = lambda a: fp(f"transformer_blocks.{depth}.attn2.to_{a}.weight") - new_state_dict[fk(f"blocks.{depth}.cross_attn.q_linear.weight")] = state_dict[key('q')] - new_state_dict[fk(f"blocks.{depth}.cross_attn.kv_linear.weight")] = torch.cat(( - state_dict[key('k')], state_dict[key('v')] - ), dim=0) - matched += [key('q'), key('k'), key('v')] - if not peft: - akey = lambda a: rep_pp(f"transformer_blocks.{depth}.attn2.to_{a}.weight") - new_state_dict[rep_pk((f"blocks.{depth}.cross_attn.q_linear.weight"))] = state_dict[akey("q")] - new_state_dict[rep_pk((f"blocks.{depth}.cross_attn.kv_linear.weight"))] = state_dict[akey("k")] - matched += [akey('q'), akey('k'), akey('v')] - - # Cross Attention projection? - key = lambda a: fp(f"transformer_blocks.{depth}.attn2.to_{a}.weight") - new_state_dict[fk(f"blocks.{depth}.cross_attn.proj.weight")] = state_dict[key('out.0')] - matched += [key('out.0')] - - try: - key = fp(f"transformer_blocks.{depth}.ff.net.0.proj.weight") - new_state_dict[fk(f"blocks.{depth}.mlp.fc1.weight")] = state_dict[key] - matched += [key] - except KeyError: - pass - - try: - key = fp(f"transformer_blocks.{depth}.ff.net.2.weight") - new_state_dict[fk(f"blocks.{depth}.mlp.fc2.weight")] = state_dict[key] - matched += [key] - except KeyError: - pass - - if len(matched) < len(state_dict): - print(f"PixArt: LoRA conversion has leftover keys! ({len(matched)} vs {len(state_dict)})") - print(list( set(state_dict.keys()) - set(matched) )) - - if len(missing) > 0: - print(f"PixArt: LoRA conversion has missing keys! 
(probably)") - print(missing) - - return new_state_dict + if "adaln_single.emb.resolution_embedder.linear_1.weight" in state_dict.keys(): + cmap = get_conversion_map(state_dict) + conversion_map_ms + else: + cmap = get_conversion_map(state_dict) + + missing = [k for k,v in cmap if v not in state_dict] + new_state_dict = {k: state_dict[v] for k,v in cmap if k not in missing} + matched = list(v for k,v in cmap if v in state_dict.keys()) + + for depth in range(get_depth(state_dict)): + for wb in ["weight", "bias"]: + # Self Attention + key = lambda a: f"transformer_blocks.{depth}.attn1.to_{a}.{wb}" + new_state_dict[f"blocks.{depth}.attn.qkv.{wb}"] = torch.cat(( + state_dict[key('q')], state_dict[key('k')], state_dict[key('v')] + ), dim=0) + matched += [key('q'), key('k'), key('v')] + + # Cross-attention (linear) + key = lambda a: f"transformer_blocks.{depth}.attn2.to_{a}.{wb}" + new_state_dict[f"blocks.{depth}.cross_attn.q_linear.{wb}"] = state_dict[key('q')] + new_state_dict[f"blocks.{depth}.cross_attn.kv_linear.{wb}"] = torch.cat(( + state_dict[key('k')], state_dict[key('v')] + ), dim=0) + matched += [key('q'), key('k'), key('v')] + + if len(matched) < len(state_dict): + print(f"PixArt: UNET conversion has leftover keys! 
({len(matched)} vs {len(state_dict)})") + print(list( set(state_dict.keys()) - set(matched) )) + + if len(missing) > 0: + print(f"PixArt: UNET conversion has missing keys!") + print(missing) + + return new_state_dict diff --git a/PixArt/loader.py b/PixArt/loader.py index cedb5fd..4fcee7c 100644 --- a/PixArt/loader.py +++ b/PixArt/loader.py @@ -1,180 +1,155 @@ +import math +import logging + +import comfy.utils +import comfy.model_base +import comfy.model_detection + import comfy.supported_models_base +import comfy.supported_models import comfy.latent_formats -import comfy.model_patcher -import comfy.model_base -import comfy.utils -import comfy.conds -import torch -import math -from comfy import model_management -from .diffusers_convert import convert_state_dict -class EXM_PixArt(comfy.supported_models_base.BASE): - unet_config = {} - unet_extra_config = {} - latent_format = comfy.latent_formats.SD15 - - def __init__(self, model_conf): - self.model_target = model_conf.get("target") - self.unet_config = model_conf.get("unet_config", {}) - self.sampling_settings = model_conf.get("sampling_settings", {}) - self.latent_format = self.latent_format() - # UNET is handled by extension - self.unet_config["disable_unet_model_creation"] = True - - def model_type(self, state_dict, prefix=""): - return comfy.model_base.ModelType.EPS - -class EXM_PixArt_Model(comfy.model_base.BaseModel): - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - - def extra_conds(self, **kwargs): - out = super().extra_conds(**kwargs) - - img_hw = kwargs.get("img_hw", None) - if img_hw is not None: - out["img_hw"] = comfy.conds.CONDRegular(torch.tensor(img_hw)) - - aspect_ratio = kwargs.get("aspect_ratio", None) - if aspect_ratio is not None: - out["aspect_ratio"] = comfy.conds.CONDRegular(torch.tensor(aspect_ratio)) - - cn_hint = kwargs.get("cn_hint", None) - if cn_hint is not None: - out["cn_hint"] = comfy.conds.CONDRegular(cn_hint) - - return out - -def load_pixart(model_path, 
model_conf=None): - state_dict = comfy.utils.load_torch_file(model_path) - state_dict = state_dict.get("model", state_dict) - - # prefix - for prefix in ["model.diffusion_model.",]: - if any(True for x in state_dict if x.startswith(prefix)): - state_dict = {k[len(prefix):]:v for k,v in state_dict.items()} - - # diffusers - if "adaln_single.linear.weight" in state_dict: - state_dict = convert_state_dict(state_dict) # Diffusers - - # guess auto config - if model_conf is None: - model_conf = guess_pixart_config(state_dict) - - parameters = comfy.utils.calculate_parameters(state_dict) - unet_dtype = model_management.unet_dtype(model_params=parameters) - load_device = comfy.model_management.get_torch_device() - offload_device = comfy.model_management.unet_offload_device() - - # ignore fp8/etc and use directly for now - manual_cast_dtype = model_management.unet_manual_cast(unet_dtype, load_device) - if manual_cast_dtype: - print(f"PixArt: falling back to {manual_cast_dtype}") - unet_dtype = manual_cast_dtype - - model_conf = EXM_PixArt(model_conf) # convert to object - model = EXM_PixArt_Model( # same as comfy.model_base.BaseModel - model_conf, - model_type=comfy.model_base.ModelType.EPS, - device=model_management.get_torch_device() - ) - - if model_conf.model_target == "PixArtMS": - from .models.PixArtMS import PixArtMS - model.diffusion_model = PixArtMS(**model_conf.unet_config) - elif model_conf.model_target == "PixArt": - from .models.PixArt import PixArt - model.diffusion_model = PixArt(**model_conf.unet_config) - elif model_conf.model_target == "PixArtMSSigma": - from .models.PixArtMS import PixArtMS - model.diffusion_model = PixArtMS(**model_conf.unet_config) - model.latent_format = comfy.latent_formats.SDXL() - elif model_conf.model_target == "ControlPixArtMSHalf": - from .models.PixArtMS import PixArtMS - from .models.pixart_controlnet import ControlPixArtMSHalf - model.diffusion_model = PixArtMS(**model_conf.unet_config) - model.diffusion_model = 
ControlPixArtMSHalf(model.diffusion_model)
-    elif model_conf.model_target == "ControlPixArtHalf":
-        from .models.PixArt import PixArt
-        from .models.pixart_controlnet import ControlPixArtHalf
-        model.diffusion_model = PixArt(**model_conf.unet_config)
-        model.diffusion_model = ControlPixArtHalf(model.diffusion_model)
-    else:
-        raise NotImplementedError(f"Unknown model target '{model_conf.model_target}'")
-
-    m, u = model.diffusion_model.load_state_dict(state_dict, strict=False)
-    if len(m) > 0: print("Missing UNET keys", m)
-    if len(u) > 0: print("Leftover UNET keys", u)
-    model.diffusion_model.dtype = unet_dtype
-    model.diffusion_model.eval()
-    model.diffusion_model.to(unet_dtype)
-
-    model_patcher = comfy.model_patcher.ModelPatcher(
-        model,
-        load_device    = load_device,
-        offload_device = offload_device,
-    )
-    return model_patcher
-
-def guess_pixart_config(sd):
-    """
-    Guess config based on converted state dict.
-    """
-    # Shared settings based on DiT_XL_2 - could be enumerated
-    config = {
-        "num_heads"   : 16,   # get from attention
-        "patch_size"  : 2,    # final layer I guess?
-        "hidden_size" : 1152, # pos_embed.shape[2]
-    }
-    config["depth"] = sum([key.endswith(".attn.proj.weight") for key in sd.keys()]) or 28
-
-    try:
-        # this is not present in the diffusers version for sigma?
-        config["model_max_length"] = sd["y_embedder.y_embedding"].shape[0]
-    except KeyError:
-        # need better logic to guess this
-        config["model_max_length"] = 300
-
-    if "pos_embed" in sd:
-        config["input_size"] = int(math.sqrt(sd["pos_embed"].shape[1])) * config["patch_size"]
-        config["pe_interpolation"] = config["input_size"] // (512//8) # dumb guess
-
-    target_arch = "PixArtMS"
-    if config["model_max_length"] == 300:
-        # Sigma
-        target_arch = "PixArtMSSigma"
-        config["micro_condition"] = False
-        if "input_size" not in config:
-            # The diffusers weights for 1K/2K are exactly the same...?
-            # replace patch embed logic with HyDiT?
-            print(f"PixArt: diffusers weights - 2K model will be broken, use manual loading!")
-            config["input_size"] = 1024//8
-    else:
-        # Alpha
-        if "csize_embedder.mlp.0.weight" in sd:
-            # MS (microconds)
-            target_arch = "PixArtMS"
-            config["micro_condition"] = True
-            if "input_size" not in config:
-                config["input_size"] = 1024//8
-                config["pe_interpolation"] = 2
-        else:
-            # PixArt
-            target_arch = "PixArt"
-            if "input_size" not in config:
-                config["input_size"] = 512//8
-                config["pe_interpolation"] = 1
-
-    print("PixArt guessed config:", target_arch, config)
-    return {
-        "target": target_arch,
-        "unet_config": config,
-        "sampling_settings": {
-            "beta_schedule" : "sqrt_linear",
-            "linear_start"  : 0.0001,
-            "linear_end"    : 0.02,
-            "timesteps"     : 1000,
-        }
-    }
+from .models.pixart import PixArt
+from .models.pixartms import PixArtMS
+from .diffusers_convert import convert_state_dict
+from ..utils.loader import load_state_dict_from_config
+from ..text_encoders.pixart.tenc import PixArtTokenizer, PixArtT5XXL
+
+class PixArtConfig(comfy.supported_models_base.BASE):
+    unet_class = PixArtMS
+    unet_config = {}
+    unet_extra_config = {}
+
+    latent_format = comfy.latent_formats.SD15
+    sampling_settings = {
+        "beta_schedule" : "sqrt_linear",
+        "linear_start"  : 0.0001,
+        "linear_end"    : 0.02,
+        "timesteps"     : 1000,
+    }
+
+    def model_type(self, state_dict, prefix=""):
+        return comfy.model_base.ModelType.EPS
+
+    def get_model(self, state_dict, prefix="", device=None):
+        return PixArtModel(model_config=self, unet_model=self.unet_class, device=device)
+
+    def clip_target(self, state_dict={}):
+        return comfy.supported_models_base.ClipTarget(PixArtTokenizer, PixArtT5XXL)
+
+class PixArtModel(comfy.model_base.BaseModel):
+    def __init__(self, *args, **kwargs):
+        super().__init__(*args, **kwargs)
+
+    def extra_conds(self, **kwargs):
+        out = super().extra_conds(**kwargs)
+        return out
+
+def load_pixart_state_dict(sd, model_options={}):
+    # prefix / format
+    sd = sd.get("model", sd) # ref ckpt
+    diffusion_model_prefix = comfy.model_detection.unet_prefix_from_state_dict(sd)
+    temp_sd = comfy.utils.state_dict_prefix_replace(sd, {diffusion_model_prefix: ""}, filter_keys=True)
+    if len(temp_sd) > 0:
+        sd = temp_sd
+
+    # diffusers convert
+    if "adaln_single.linear.weight" in sd:
+        sd = convert_state_dict(sd)
+
+    # model config
+    model_config = model_config_from_unet(sd)
+    return load_state_dict_from_config(model_config, sd, model_options)
+
+def model_config_from_unet(sd):
+    """
+    Guess config based on (converted) state dict.
+    """
+    # Shared settings based on DiT_XL_2 - could be enumerated
+    config = {
+        "num_heads"   : 16,   # get from attention
+        "patch_size"  : 2,    # final layer I guess?
+        "hidden_size" : 1152, # pos_embed.shape[2]
+    }
+    config["depth"] = sum([key.endswith(".attn.proj.weight") for key in sd.keys()]) or 28
+
+    try:
+        # this is not present in the diffusers version for sigma?
+        config["model_max_length"] = sd["y_embedder.y_embedding"].shape[0]
+    except KeyError:
+        # need better logic to guess this
+        config["model_max_length"] = 300
+
+    if "pos_embed" in sd:
+        config["input_size"] = int(math.sqrt(sd["pos_embed"].shape[1])) * config["patch_size"]
+        config["pe_interpolation"] = config["input_size"] // (512//8) # dumb guess
+
+    latent_format = comfy.latent_formats.SD15
+    if config["model_max_length"] == 300:
+        # Sigma
+        model_class = PixArtMS
+        latent_format = comfy.latent_formats.SDXL
+        config["micro_condition"] = False
+        if "input_size" not in config:
+            # The diffusers weights for 1K/2K are exactly the same...?
+            # replace patch embed logic with HyDiT?
+            logging.warning("PixArt: diffusers weights - 2K model will be broken, use manual loading!")
+            config["input_size"] = 1024//8
+    else:
+        # Alpha
+        if "csize_embedder.mlp.0.weight" in sd:
+            # MS (microconds)
+            model_class = PixArtMS
+            config["micro_condition"] = True
+            if "input_size" not in config:
+                config["input_size"] = 1024//8
+                config["pe_interpolation"] = 2
+        else:
+            # PixArt
+            model_class = PixArt
+            if "input_size" not in config:
+                config["input_size"] = 512//8
+                config["pe_interpolation"] = 1
+    model_config = PixArtConfig(config)
+    model_config.unet_class = model_class
+    model_config.latent_format = latent_format
+    logging.debug(f"PixArt config: {model_class}\n{config}")
+    return model_config
+
+resolutions = {
+    "PixArt 512": {
+        0.25: [256,1024], 0.26: [256, 992], 0.27: [256, 960], 0.28: [256, 928],
+        0.32: [288, 896], 0.33: [288, 864], 0.35: [288, 832], 0.40: [320, 800],
+        0.42: [320, 768], 0.48: [352, 736], 0.50: [352, 704], 0.52: [352, 672],
+        0.57: [384, 672], 0.60: [384, 640], 0.68: [416, 608], 0.72: [416, 576],
+        0.78: [448, 576], 0.82: [448, 544], 0.88: [480, 544], 0.94: [480, 512],
+        1.00: [512, 512], 1.07: [512, 480], 1.13: [544, 480], 1.21: [544, 448],
+        1.29: [576, 448], 1.38: [576, 416], 1.46: [608, 416], 1.67: [640, 384],
+        1.75: [672, 384], 2.00: [704, 352], 2.09: [736, 352], 2.40: [768, 320],
+        2.50: [800, 320], 2.89: [832, 288], 3.00: [864, 288], 3.11: [896, 288],
+        3.62: [928, 256], 3.75: [960, 256], 3.88: [992, 256], 4.00: [1024,256]
+    },
+    "PixArt 1024": {
+        0.25: [512, 2048], 0.26: [512, 1984], 0.27: [512, 1920], 0.28: [512, 1856],
+        0.32: [576, 1792], 0.33: [576, 1728], 0.35: [576, 1664], 0.40: [640, 1600],
+        0.42: [640, 1536], 0.48: [704, 1472], 0.50: [704, 1408], 0.52: [704, 1344],
+        0.57: [768, 1344], 0.60: [768, 1280], 0.68: [832, 1216], 0.72: [832, 1152],
+        0.78: [896, 1152], 0.82: [896, 1088], 0.88: [960, 1088], 0.94: [960, 1024],
+        1.00: [1024,1024], 1.07: [1024, 960], 1.13: [1088, 960], 1.21: [1088, 896],
+        1.29: [1152, 896], 1.38: [1152, 832], 1.46: [1216, 832], 1.67: [1280, 768],
+        1.75: [1344, 768], 2.00: [1408, 704], 2.09: [1472, 704], 2.40: [1536, 640],
+        2.50: [1600, 640], 2.89: [1664, 576], 3.00: [1728, 576], 3.11: [1792, 576],
+        3.62: [1856, 512], 3.75: [1920, 512], 3.88: [1984, 512], 4.00: [2048, 512],
+    },
+    "PixArt 2K": {
+        0.25: [1024, 4096], 0.26: [1024, 3968], 0.27: [1024, 3840], 0.28: [1024, 3712],
+        0.32: [1152, 3584], 0.33: [1152, 3456], 0.35: [1152, 3328], 0.40: [1280, 3200],
+        0.42: [1280, 3072], 0.48: [1408, 2944], 0.50: [1408, 2816], 0.52: [1408, 2688],
+        0.57: [1536, 2688], 0.60: [1536, 2560], 0.68: [1664, 2432], 0.72: [1664, 2304],
+        0.78: [1792, 2304], 0.82: [1792, 2176], 0.88: [1920, 2176], 0.94: [1920, 2048],
+        1.00: [2048, 2048], 1.07: [2048, 1920], 1.13: [2176, 1920], 1.21: [2176, 1792],
+        1.29: [2304, 1792], 1.38: [2304, 1664], 1.46: [2432, 1664], 1.67: [2560, 1536],
+        1.75: [2688, 1536], 2.00: [2816, 1408], 2.09: [2944, 1408], 2.40: [3072, 1280],
+        2.50: [3200, 1280], 2.89: [3328, 1152], 3.00: [3456, 1152], 3.11: [3584, 1152],
+        3.62: [3712, 1024], 3.75: [3840, 1024], 3.88: [3968, 1024], 4.00: [4096, 1024]
+    }
+}
diff --git a/PixArt/lora.py b/PixArt/lora.py
deleted file mode 100644
index fca5931..0000000
--- a/PixArt/lora.py
+++ /dev/null
@@ -1,146 +0,0 @@
-import os
-import copy
-import json
-import torch
-import comfy.lora
-import comfy.model_management
-from comfy.model_patcher import ModelPatcher
-from .diffusers_convert import convert_lora_state_dict
-
-class EXM_PixArt_ModelPatcher(ModelPatcher):
-    def calculate_weight(self, patches, weight, key):
-        """
-        This is almost the same as the comfy function, but stripped down to just the LoRA patch code.
-        The problem with the original code is the q/k/v keys being combined into one for the attention.
-        In the diffusers code, they're treated as separate keys, but in the reference code they're recombined (q+kv|qkv).
-        This means, for example, that the [1152,1152] weights become [3456,1152] in the state dict.
- The issue with this is that the LoRA weights are [128,1152],[1152,128] and become [384,1162],[3456,128] instead. - - This is the best thing I could think of that would fix that, but it's very fragile. - - Check key shape to determine if it needs the fallback logic - - Cut the input into parts based on the shape (undoing the torch.cat) - - Do the matrix multiplication logic - - Recombine them to match the expected shape - """ - for p in patches: - alpha = p[0] - v = p[1] - strength_model = p[2] - if strength_model != 1.0: - weight *= strength_model - - if isinstance(v, list): - v = (self.calculate_weight(v[1:], v[0].clone(), key), ) - - if len(v) == 2: - patch_type = v[0] - v = v[1] - - if patch_type == "lora": - mat1 = comfy.model_management.cast_to_device(v[0], weight.device, torch.float32) - mat2 = comfy.model_management.cast_to_device(v[1], weight.device, torch.float32) - if v[2] is not None: - alpha *= v[2] / mat2.shape[0] - try: - mat1 = mat1.flatten(start_dim=1) - mat2 = mat2.flatten(start_dim=1) - - ch1 = mat1.shape[0] // mat2.shape[1] - ch2 = mat2.shape[0] // mat1.shape[1] - ### Fallback logic for shape mismatch ### - if mat1.shape[0] != mat2.shape[1] and ch1 == ch2 and (mat1.shape[0]/mat2.shape[1])%1 == 0: - mat1 = mat1.chunk(ch1, dim=0) - mat2 = mat2.chunk(ch1, dim=0) - weight += torch.cat( - [alpha * torch.mm(mat1[x], mat2[x]) for x in range(ch1)], - dim=0, - ).reshape(weight.shape).type(weight.dtype) - else: - weight += (alpha * torch.mm(mat1, mat2)).reshape(weight.shape).type(weight.dtype) - except Exception as e: - print("ERROR", key, e) - return weight - - def clone(self): - n = EXM_PixArt_ModelPatcher(self.model, self.load_device, self.offload_device, self.size, self.current_device, weight_inplace_update=self.weight_inplace_update) - n.patches = {} - for k in self.patches: - n.patches[k] = self.patches[k][:] - - n.object_patches = self.object_patches.copy() - n.model_options = copy.deepcopy(self.model_options) - n.model_keys = self.model_keys - 
return n - -def replace_model_patcher(model): - n = EXM_PixArt_ModelPatcher( - model = model.model, - size = model.size, - load_device = model.load_device, - offload_device = model.offload_device, - weight_inplace_update = model.weight_inplace_update, - ) - n.patches = {} - for k in model.patches: - n.patches[k] = model.patches[k][:] - - n.object_patches = model.object_patches.copy() - n.model_options = copy.deepcopy(model.model_options) - return n - -def find_peft_alpha(path): - def load_json(json_path): - with open(json_path) as f: - data = json.load(f) - alpha = data.get("lora_alpha") - alpha = alpha or data.get("alpha") - if not alpha: - print(" Found config but `lora_alpha` is missing!") - else: - print(f" Found config at {json_path} [alpha:{alpha}]") - return alpha - - # For some weird reason peft doesn't include the alpha in the actual model - print("PixArt: Warning! This is a PEFT LoRA. Trying to find config...") - files = [ - f"{os.path.splitext(path)[0]}.json", - f"{os.path.splitext(path)[0]}.config.json", - os.path.join(os.path.dirname(path),"adapter_config.json"), - ] - for file in files: - if os.path.isfile(file): - return load_json(file) - - print(" Missing config/alpha! assuming alpha of 8. Consider converting it/adding a config json to it.") - return 8.0 - -def load_pixart_lora(model, lora, lora_path, strength): - k_back = lambda x: x.replace(".lora_up.weight", "") - # need to convert the actual weights for this to work. 
- if any(True for x in lora.keys() if x.endswith("adaln_single.linear.lora_A.weight")): - lora = convert_lora_state_dict(lora, peft=True) - alpha = find_peft_alpha(lora_path) - lora.update({f"{k_back(x)}.alpha":torch.tensor(alpha) for x in lora.keys() if "lora_up" in x}) - else: # OneTrainer - lora = convert_lora_state_dict(lora, peft=False) - - key_map = {k_back(x):f"diffusion_model.{k_back(x)}.weight" for x in lora.keys() if "lora_up" in x} # fake - - loaded = comfy.lora.load_lora(lora, key_map) - if model is not None: - # switch to custom model patcher when using LoRAs - if isinstance(model, EXM_PixArt_ModelPatcher): - new_modelpatcher = model.clone() - else: - new_modelpatcher = replace_model_patcher(model) - k = new_modelpatcher.add_patches(loaded, strength) - else: - k = () - new_modelpatcher = None - - k = set(k) - for x in loaded: - if (x not in k): - print("NOT LOADED", x) - - return new_modelpatcher diff --git a/T5/LICENSE-T5 b/PixArt/models/LICENSE similarity index 99% rename from T5/LICENSE-T5 rename to PixArt/models/LICENSE index 261eeb9..fb524b1 100644 --- a/T5/LICENSE-T5 +++ b/PixArt/models/LICENSE @@ -186,7 +186,7 @@ same "printed page" as the copyright notice for easier identification within third-party archives. - Copyright [yyyy] [name of copyright owner] + Copyright 2024 Junsong Chen, Jincheng Yu, Enze Xie Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. 
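The chunked-matmul fallback that the deleted `calculate_weight` above describes (splitting a LoRA patch for a fused q/k/v weight into per-branch chunks, multiplying pairwise, then re-concatenating) can be sketched without the surrounding patcher machinery. This is an illustrative NumPy reconstruction under assumed sizes (`dim`, `rank`, and the variable names are made up for the example), not the removed torch implementation itself:

```python
import numpy as np

# Hypothetical sizes: hidden dim 1152, LoRA rank 128, with q/k/v fused
# so the base weight (and the "up" matrix) spans 3*1152 output rows.
dim, rank = 1152, 128
rng = np.random.default_rng(0)

mat1 = rng.standard_normal((3 * dim, rank))   # "lora_up", fused q/k/v
mat2 = rng.standard_normal((3 * rank, dim))   # "lora_down", stacked per branch

# A single matmul no longer works (inner dims 128 vs 3456 don't match),
# but both matrices split evenly into the same number of branch chunks:
ch = mat1.shape[0] // mat2.shape[1]           # 3456 // 1152 == 3
assert ch == mat2.shape[0] // mat1.shape[1]   # 384 // 128 == 3

up_chunks = np.split(mat1, ch, axis=0)        # three [1152, 128] blocks
down_chunks = np.split(mat2, ch, axis=0)      # three [128, 1152] blocks

# Per-branch matmul, then concatenate back to the fused weight shape,
# mirroring the torch.cat([... torch.mm(mat1[x], mat2[x]) ...]) fallback.
delta = np.concatenate(
    [up_chunks[i] @ down_chunks[i] for i in range(ch)], axis=0
)
assert delta.shape == (3 * dim, dim)          # matches the fused [3456, 1152]
```

The key observation is the shape test used as the trigger: when `mat1.shape[0] != mat2.shape[1]` but both dimensions divide into the same chunk count, the weights were trained per-branch and only the block-diagonal product is meaningful.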
diff --git a/PixArt/models/__init__.py b/PixArt/models/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/PixArt/models/PixArt_blocks.py b/PixArt/models/blocks.py similarity index 72% rename from PixArt/models/PixArt_blocks.py rename to PixArt/models/blocks.py index df2eae0..d540d96 100644 --- a/PixArt/models/PixArt_blocks.py +++ b/PixArt/models/blocks.py @@ -12,9 +12,11 @@ import torch import torch.nn as nn import torch.nn.functional as F -from timm.models.vision_transformer import Mlp, Attention as Attention_ from einops import rearrange +import comfy.ldm.common_dit +from .utils import to_2tuple + sdpa_32b = None Q_4GB_LIMIT = 32000000 """If q is greater than this, the operation will likely require >4GB VRAM, which will fail on Intel Arc Alchemist GPUs without a workaround.""" @@ -24,11 +26,14 @@ from comfy import model_management if model_management.xformers_enabled(): - import xformers import xformers.ops + if int((xformers.__version__).split(".")[2]) >= 28: + block_diagonal_mask_from_seqlens = xformers.ops.fmha.attn_bias.BlockDiagonalMask.from_seqlens + else: + block_diagonal_mask_from_seqlens = xformers.ops.fmha.BlockDiagonalMask.from_seqlens else: if model_management.xpu_available: - import intel_extension_for_pytorch as ipex + import intel_extension_for_pytorch as ipex # type: ignore import os if not torch.xpu.has_fp64_dtype() and not os.environ.get('IPEX_FORCE_ATTENTION_SLICE', None): from ...utils.IPEX.attention import scaled_dot_product_attention_32_bit @@ -44,7 +49,7 @@ def t2i_modulate(x, shift, scale): return x * (1 + scale) + shift class MultiHeadCrossAttention(nn.Module): - def __init__(self, d_model, num_heads, attn_drop=0., proj_drop=0., **block_kwargs): + def __init__(self, d_model, num_heads, attn_drop=0., proj_drop=0., dtype=None, device=None, operations=None, **block_kwargs): super(MultiHeadCrossAttention, self).__init__() assert d_model % num_heads == 0, "d_model must be divisible by num_heads" @@ -52,10 +57,10 @@ def 
__init__(self, d_model, num_heads, attn_drop=0., proj_drop=0., **block_kwarg self.num_heads = num_heads self.head_dim = d_model // num_heads - self.q_linear = nn.Linear(d_model, d_model) - self.kv_linear = nn.Linear(d_model, d_model*2) + self.q_linear = operations.Linear(d_model, d_model, dtype=dtype, device=device) + self.kv_linear = operations.Linear(d_model, d_model*2, dtype=dtype, device=device) self.attn_drop = nn.Dropout(attn_drop) - self.proj = nn.Linear(d_model, d_model) + self.proj = operations.Linear(d_model, d_model, dtype=dtype, device=device) self.proj_drop = nn.Dropout(proj_drop) def forward(self, x, cond, mask=None): @@ -69,7 +74,7 @@ def forward(self, x, cond, mask=None): if model_management.xformers_enabled(): attn_bias = None if mask is not None: - attn_bias = xformers.ops.fmha.BlockDiagonalMask.from_seqlens([N] * B, mask) + attn_bias = block_diagonal_mask_from_seqlens([N] * B, mask) x = xformers.ops.memory_efficient_attention( q, k, v, p=self.attn_drop.p, @@ -111,7 +116,7 @@ def forward(self, x, cond, mask=None): return x -class AttentionKVCompress(Attention_): +class AttentionKVCompress(nn.Module): """Multi-head Attention block with KV token compression and qk norm.""" def __init__( @@ -122,6 +127,9 @@ def __init__( sampling='conv', sr_ratio=1, qk_norm=False, + dtype=None, + device=None, + operations=None, **block_kwargs, ): """ @@ -130,19 +138,26 @@ def __init__( num_heads (int): Number of attention heads. qkv_bias (bool: If True, add a learnable bias to query, key, value. 
""" - super().__init__(dim, num_heads=num_heads, qkv_bias=qkv_bias, **block_kwargs) + super().__init__() + assert dim % num_heads == 0, 'dim should be divisible by num_heads' + self.num_heads = num_heads + self.head_dim = dim // num_heads + self.scale = self.head_dim ** -0.5 + + self.qkv = operations.Linear(dim, dim * 3, bias=qkv_bias, dtype=dtype, device=device) + self.proj = operations.Linear(dim, dim, dtype=dtype, device=device) self.sampling=sampling # ['conv', 'ave', 'uniform', 'uniform_every'] self.sr_ratio = sr_ratio if sr_ratio > 1 and sampling == 'conv': # Avg Conv Init. - self.sr = nn.Conv2d(dim, dim, groups=dim, kernel_size=sr_ratio, stride=sr_ratio) - self.sr.weight.data.fill_(1/sr_ratio**2) - self.sr.bias.data.zero_() - self.norm = nn.LayerNorm(dim) + self.sr = operations.Conv2d(dim, dim, groups=dim, kernel_size=sr_ratio, stride=sr_ratio, dtype=dtype, device=device) + # self.sr.weight.data.fill_(1/sr_ratio**2) + # self.sr.bias.data.zero_() + self.norm = operations.LayerNorm(dim, dtype=dtype, device=device) if qk_norm: - self.q_norm = nn.LayerNorm(dim) - self.k_norm = nn.LayerNorm(dim) + self.q_norm = operations.LayerNorm(dim, dtype=dtype, device=device) + self.k_norm = operations.LayerNorm(dim, dtype=dtype, device=device) else: self.q_norm = nn.Identity() self.k_norm = nn.Identity() @@ -204,14 +219,12 @@ def forward(self, x, mask=None, HW=None, block_id=None): if model_management.xformers_enabled(): x = xformers.ops.memory_efficient_attention( q, k, v, - p=self.attn_drop.p, + p=0, attn_bias=attn_bias ) else: q, k, v = map(lambda t: t.transpose(1, 2),(q, k, v),) - - p = getattr(self.attn_drop, "p", 0) # IPEX.optimize() will turn attn_drop into an Identity() - + p = 0 if sdpa_32b is not None and (q.element_size() * q.nelement()) > Q_4GB_LIMIT: sdpa = sdpa_32b else: @@ -224,30 +237,6 @@ def forward(self, x, mask=None, HW=None, block_id=None): ).transpose(1, 2).contiguous() x = x.view(B, N, C) x = self.proj(x) - x = self.proj_drop(x) - return x - - 
-################################################################################# -# AMP attention with fp32 softmax to fix loss NaN problem during training # -################################################################################# -class Attention(Attention_): - def forward(self, x): - B, N, C = x.shape - qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads).permute(2, 0, 3, 1, 4) - q, k, v = qkv.unbind(0) # make torchscript happy (cannot use tensor as tuple) - use_fp32_attention = getattr(self, 'fp32_attention', False) - if use_fp32_attention: - q, k = q.float(), k.float() - with torch.cuda.amp.autocast(enabled=not use_fp32_attention): - attn = (q @ k.transpose(-2, -1)) * self.scale - attn = attn.softmax(dim=-1) - - attn = self.attn_drop(attn) - - x = (attn @ v).transpose(1, 2).reshape(B, N, C) - x = self.proj(x) - x = self.proj_drop(x) return x @@ -256,13 +245,13 @@ class FinalLayer(nn.Module): The final layer of PixArt. """ - def __init__(self, hidden_size, patch_size, out_channels): + def __init__(self, hidden_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True) + self.norm_final = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) self.adaLN_modulation = nn.Sequential( nn.SiLU(), - nn.Linear(hidden_size, 2 * hidden_size, bias=True) + operations.Linear(hidden_size, 2 * hidden_size, bias=True, dtype=dtype, device=device) ) def forward(self, x, c): @@ -271,23 +260,23 @@ def forward(self, x, c): x = self.linear(x) return x - class T2IFinalLayer(nn.Module): """ The final layer of PixArt. 
""" - def __init__(self, hidden_size, patch_size, out_channels): + def __init__(self, hidden_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True) + self.norm_final = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) self.scale_shift_table = nn.Parameter(torch.randn(2, hidden_size) / hidden_size ** 0.5) self.out_channels = out_channels def forward(self, x, t): + dtype = x.dtype shift, scale = (self.scale_shift_table[None] + t[:, None]).chunk(2, dim=1) x = t2i_modulate(self.norm_final(x), shift, scale) - x = self.linear(x) + x = self.linear(x.to(dtype)) return x @@ -296,13 +285,13 @@ class MaskFinalLayer(nn.Module): The final layer of PixArt. 
""" - def __init__(self, final_hidden_size, c_emb_size, patch_size, out_channels): + def __init__(self, final_hidden_size, c_emb_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(final_hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(final_hidden_size, patch_size * patch_size * out_channels, bias=True) + self.norm_final = operations.LayerNorm(final_hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(final_hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) self.adaLN_modulation = nn.Sequential( nn.SiLU(), - nn.Linear(c_emb_size, 2 * final_hidden_size, bias=True) + operations.Linear(c_emb_size, 2 * final_hidden_size, bias=True, dtype=dtype, device=device) ) def forward(self, x, t): shift, scale = self.adaLN_modulation(t).chunk(2, dim=1) @@ -316,13 +305,13 @@ class DecoderLayer(nn.Module): The final layer of PixArt. """ - def __init__(self, hidden_size, decoder_hidden_size): + def __init__(self, hidden_size, decoder_hidden_size, dtype=None, device=None, operations=None): super().__init__() - self.norm_decoder = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, decoder_hidden_size, bias=True) + self.norm_decoder = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, decoder_hidden_size, bias=True, dtype=dtype, device=device) self.adaLN_modulation = nn.Sequential( nn.SiLU(), - nn.Linear(hidden_size, 2 * hidden_size, bias=True) + operations.Linear(hidden_size, 2 * hidden_size, bias=True, dtype=dtype, device=device) ) def forward(self, x, t): shift, scale = self.adaLN_modulation(t).chunk(2, dim=1) @@ -339,12 +328,12 @@ class TimestepEmbedder(nn.Module): Embeds scalar timesteps into vector representations. 
""" - def __init__(self, hidden_size, frequency_embedding_size=256): + def __init__(self, hidden_size, frequency_embedding_size=256, dtype=None, device=None, operations=None): super().__init__() self.mlp = nn.Sequential( - nn.Linear(frequency_embedding_size, hidden_size, bias=True), + operations.Linear(frequency_embedding_size, hidden_size, bias=True, dtype=dtype, device=device), nn.SiLU(), - nn.Linear(hidden_size, hidden_size, bias=True), + operations.Linear(hidden_size, hidden_size, bias=True, dtype=dtype, device=device), ) self.frequency_embedding_size = frequency_embedding_size @@ -368,9 +357,9 @@ def timestep_embedding(t, dim, max_period=10000): embedding = torch.cat([embedding, torch.zeros_like(embedding[:, :1])], dim=-1) return embedding - def forward(self, t): + def forward(self, t, dtype): t_freq = self.timestep_embedding(t, self.frequency_embedding_size) - t_emb = self.mlp(t_freq.to(t.dtype)) + t_emb = self.mlp(t_freq.to(dtype)) return t_emb @@ -379,12 +368,12 @@ class SizeEmbedder(TimestepEmbedder): Embeds scalar timesteps into vector representations. """ - def __init__(self, hidden_size, frequency_embedding_size=256): - super().__init__(hidden_size=hidden_size, frequency_embedding_size=frequency_embedding_size) + def __init__(self, hidden_size, frequency_embedding_size=256, dtype=None, device=None, operations=None): + super().__init__(hidden_size=hidden_size, frequency_embedding_size=frequency_embedding_size, operations=operations) self.mlp = nn.Sequential( - nn.Linear(frequency_embedding_size, hidden_size, bias=True), + operations.Linear(frequency_embedding_size, hidden_size, bias=True, dtype=dtype, device=device), nn.SiLU(), - nn.Linear(hidden_size, hidden_size, bias=True), + operations.Linear(hidden_size, hidden_size, bias=True, dtype=dtype, device=device), ) self.frequency_embedding_size = frequency_embedding_size self.outdim = hidden_size @@ -409,10 +398,10 @@ class LabelEmbedder(nn.Module): Embeds class labels into vector representations. 
    Also handles label dropout for classifier-free guidance.
    """
-    def __init__(self, num_classes, hidden_size, dropout_prob):
+    def __init__(self, num_classes, hidden_size, dropout_prob, dtype=None, device=None, operations=None):
         super().__init__()
         use_cfg_embedding = dropout_prob > 0
-        self.embedding_table = nn.Embedding(num_classes + use_cfg_embedding, hidden_size)
+        self.embedding_table = operations.Embedding(num_classes + use_cfg_embedding, hidden_size, dtype=dtype, device=device)
         self.num_classes = num_classes
         self.dropout_prob = dropout_prob
@@ -435,14 +424,35 @@ def forward(self, labels, train, force_drop_ids=None):
         return embeddings
 
+class Mlp(nn.Module):
+    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, bias=True, drop=None, dtype=None, device=None, operations=None) -> None:
+        super().__init__()
+        out_features = out_features or in_features
+        hidden_features = hidden_features or in_features
+
+        self.fc1 = operations.Linear(in_features, hidden_features, bias=bias, dtype=dtype, device=device)
+        self.act = act_layer()
+        self.fc2 = operations.Linear(hidden_features, out_features, bias=bias, dtype=dtype, device=device)
+
+        self.drop1 = nn.Identity()
+        self.drop2 = nn.Identity()
+
+    def forward(self, x: torch.Tensor) -> torch.Tensor:
+        x = self.act(self.fc1(x))
+        return self.fc2(x)
+
+
 class CaptionEmbedder(nn.Module):
     """
     Embeds class labels into vector representations.
     Also handles label dropout for classifier-free guidance.
""" - def __init__(self, in_channels, hidden_size, uncond_prob, act_layer=nn.GELU(approximate='tanh'), token_num=120): + def __init__(self, in_channels, hidden_size, uncond_prob, act_layer=nn.GELU(approximate='tanh'), token_num=120, dtype=None, device=None, operations=None): super().__init__() - self.y_proj = Mlp(in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, drop=0) + self.y_proj = Mlp( + in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, + dtype=dtype, device=device, operations=operations, + ) self.register_buffer("y_embedding", nn.Parameter(torch.randn(token_num, in_channels) / in_channels ** 0.5)) self.uncond_prob = uncond_prob @@ -472,9 +482,12 @@ class CaptionEmbedderDoubleBr(nn.Module): Embeds class labels into vector representations. Also handles label dropout for classifier-free guidance. """ - def __init__(self, in_channels, hidden_size, uncond_prob, act_layer=nn.GELU(approximate='tanh'), token_num=120): + def __init__(self, in_channels, hidden_size, uncond_prob, act_layer=nn.GELU(approximate='tanh'), token_num=120, dtype=None, device=None, operations=None): super().__init__() - self.proj = Mlp(in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, drop=0) + self.proj = Mlp( + in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, + dtype=dtype, device=device, operations=operations, + ) self.embedding = nn.Parameter(torch.randn(1, in_channels) / 10 ** 0.5) self.y_embedding = nn.Parameter(torch.randn(token_num, in_channels) / 10 ** 0.5) self.uncond_prob = uncond_prob diff --git a/PixArt/models/PixArt.py b/PixArt/models/pixart.py similarity index 96% rename from PixArt/models/PixArt.py rename to PixArt/models/pixart.py index 4d6cf93..a5ca872 100644 --- a/PixArt/models/PixArt.py +++ b/PixArt/models/pixart.py @@ -8,17 +8,23 @@ # GLIDE: 
https://github.com/openai/glide-text2im # MAE: https://github.com/facebookresearch/mae/blob/main/models_mae.py # -------------------------------------------------------- -import math import torch import torch.nn as nn -import os import numpy as np -from timm.models.layers import DropPath -from timm.models.vision_transformer import PatchEmbed, Mlp - from .utils import auto_grad_checkpoint, to_2tuple -from .PixArt_blocks import t2i_modulate, CaptionEmbedder, AttentionKVCompress, MultiHeadCrossAttention, T2IFinalLayer, TimestepEmbedder, LabelEmbedder, FinalLayer +from .blocks import ( + t2i_modulate, + CaptionEmbedder, + AttentionKVCompress, + MultiHeadCrossAttention, + T2IFinalLayer, + TimestepEmbedder, + LabelEmbedder, + FinalLayer, + Mlp +) +from comfy.ldm.modules.diffusionmodules.mmdit import PatchEmbed class PixArtBlock(nn.Module): @@ -37,7 +43,7 @@ def __init__(self, hidden_size, num_heads, mlp_ratio=4.0, drop_path=0, input_siz # to be compatible with lower version pytorch approx_gelu = lambda: nn.GELU(approximate="tanh") self.mlp = Mlp(in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0) - self.drop_path = DropPath(drop_path) if drop_path > 0. else nn.Identity() + self.drop_path = nn.Identity() #DropPath(drop_path) if drop_path > 0. 
else nn.Identity() self.scale_shift_table = nn.Parameter(torch.randn(6, hidden_size) / hidden_size ** 0.5) self.sampling = sampling self.sr_ratio = sr_ratio diff --git a/PixArt/models/pixart_controlnet.py b/PixArt/models/pixart_controlnet.py deleted file mode 100644 index 37fa4c1..0000000 --- a/PixArt/models/pixart_controlnet.py +++ /dev/null @@ -1,312 +0,0 @@ -import re -import torch -import torch.nn as nn - -from copy import deepcopy -from torch import Tensor -from torch.nn import Module, Linear, init -from typing import Any, Mapping - -from .PixArt import PixArt, get_2d_sincos_pos_embed -from .PixArtMS import PixArtMSBlock, PixArtMS -from .utils import auto_grad_checkpoint - -# The implementation of ControlNet-Half architrecture -# https://github.com/lllyasviel/ControlNet/discussions/188 -class ControlT2IDitBlockHalf(Module): - def __init__(self, base_block: PixArtMSBlock, block_index: 0) -> None: - super().__init__() - self.copied_block = deepcopy(base_block) - self.block_index = block_index - - for p in self.copied_block.parameters(): - p.requires_grad_(True) - - self.copied_block.load_state_dict(base_block.state_dict()) - self.copied_block.train() - - self.hidden_size = hidden_size = base_block.hidden_size - if self.block_index == 0: - self.before_proj = Linear(hidden_size, hidden_size) - init.zeros_(self.before_proj.weight) - init.zeros_(self.before_proj.bias) - self.after_proj = Linear(hidden_size, hidden_size) - init.zeros_(self.after_proj.weight) - init.zeros_(self.after_proj.bias) - - def forward(self, x, y, t, mask=None, c=None): - - if self.block_index == 0: - # the first block - c = self.before_proj(c) - c = self.copied_block(x + c, y, t, mask) - c_skip = self.after_proj(c) - else: - # load from previous c and produce the c for skip connection - c = self.copied_block(c, y, t, mask) - c_skip = self.after_proj(c) - - return c, c_skip - - -# The implementation of ControlPixArtHalf net -class ControlPixArtHalf(Module): - # only support single res model - 
def __init__(self, base_model: PixArt, copy_blocks_num: int = 13) -> None: - super().__init__() - self.dtype = torch.get_default_dtype() - self.base_model = base_model.eval() - self.controlnet = [] - self.copy_blocks_num = copy_blocks_num - self.total_blocks_num = len(base_model.blocks) - for p in self.base_model.parameters(): - p.requires_grad_(False) - - # Copy first copy_blocks_num block - for i in range(copy_blocks_num): - self.controlnet.append(ControlT2IDitBlockHalf(base_model.blocks[i], i)) - self.controlnet = nn.ModuleList(self.controlnet) - - def __getattr__(self, name: str) -> Tensor or Module: - if name in ['forward', 'forward_with_dpmsolver', 'forward_with_cfg', 'forward_c', 'load_state_dict']: - return self.__dict__[name] - elif name in ['base_model', 'controlnet']: - return super().__getattr__(name) - else: - return getattr(self.base_model, name) - - def forward_c(self, c): - self.h, self.w = c.shape[-2]//self.patch_size, c.shape[-1]//self.patch_size - pos_embed = torch.from_numpy(get_2d_sincos_pos_embed(self.pos_embed.shape[-1], (self.h, self.w), lewei_scale=self.lewei_scale, base_size=self.base_size)).unsqueeze(0).to(c.device).to(self.dtype) - return self.x_embedder(c) + pos_embed if c is not None else c - - # def forward(self, x, t, c, **kwargs): - # return self.base_model(x, t, c=self.forward_c(c), **kwargs) - def forward_raw(self, x, timestep, y, mask=None, data_info=None, c=None, **kwargs): - # modify the original PixArtMS forward function - if c is not None: - c = c.to(self.dtype) - c = self.forward_c(c) - """ - Forward pass of PixArt. 
- x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) - t: (N,) tensor of diffusion timesteps - y: (N, 1, 120, C) tensor of class labels - """ - x = x.to(self.dtype) - timestep = timestep.to(self.dtype) - y = y.to(self.dtype) - pos_embed = self.pos_embed.to(self.dtype) - self.h, self.w = x.shape[-2]//self.patch_size, x.shape[-1]//self.patch_size - x = self.x_embedder(x) + pos_embed # (N, T, D), where T = H * W / patch_size ** 2 - t = self.t_embedder(timestep.to(x.dtype)) # (N, D) - t0 = self.t_block(t) - y = self.y_embedder(y, self.training) # (N, 1, L, D) - if mask is not None: - if mask.shape[0] != y.shape[0]: - mask = mask.repeat(y.shape[0] // mask.shape[0], 1) - mask = mask.squeeze(1).squeeze(1) - y = y.squeeze(1).masked_select(mask.unsqueeze(-1) != 0).view(1, -1, x.shape[-1]) - y_lens = mask.sum(dim=1).tolist() - else: - y_lens = [y.shape[2]] * y.shape[0] - y = y.squeeze(1).view(1, -1, x.shape[-1]) - - # define the first layer - x = auto_grad_checkpoint(self.base_model.blocks[0], x, y, t0, y_lens, **kwargs) # (N, T, D) #support grad checkpoint - - if c is not None: - # update c - for index in range(1, self.copy_blocks_num + 1): - c, c_skip = auto_grad_checkpoint(self.controlnet[index - 1], x, y, t0, y_lens, c, **kwargs) - x = auto_grad_checkpoint(self.base_model.blocks[index], x + c_skip, y, t0, y_lens, **kwargs) - - # update x - for index in range(self.copy_blocks_num + 1, self.total_blocks_num): - x = auto_grad_checkpoint(self.base_model.blocks[index], x, y, t0, y_lens, **kwargs) - else: - for index in range(1, self.total_blocks_num): - x = auto_grad_checkpoint(self.base_model.blocks[index], x, y, t0, y_lens, **kwargs) - - x = self.final_layer(x, t) # (N, T, patch_size ** 2 * out_channels) - x = self.unpatchify(x) # (N, out_channels, H, W) - return x - - def forward(self, x, timesteps, context, cn_hint=None, **kwargs): - """ - Forward pass that adapts comfy input to original forward function - x: (N, C, H, W) tensor of 
spatial inputs (images or latent representations of images) - timesteps: (N,) tensor of diffusion timesteps - context: (N, 1, 120, C) conditioning - cn_hint: controlnet hint - """ - ## Still accepts the input w/o that dim but returns garbage - if len(context.shape) == 3: - context = context.unsqueeze(1) - - ## run original forward pass - out = self.forward_raw( - x = x.to(self.dtype), - timestep = timesteps.to(self.dtype), - y = context.to(self.dtype), - c = cn_hint, - ) - - ## only return EPS - out = out.to(torch.float) - eps, rest = out[:, :self.in_channels], out[:, self.in_channels:] - return eps - - def forward_with_dpmsolver(self, x, t, y, data_info, c, **kwargs): - model_out = self.forward_raw(x, t, y, data_info=data_info, c=c, **kwargs) - return model_out.chunk(2, dim=1)[0] - - # def forward_with_dpmsolver(self, x, t, y, data_info, c, **kwargs): - # return self.base_model.forward_with_dpmsolver(x, t, y, data_info=data_info, c=self.forward_c(c), **kwargs) - - def forward_with_cfg(self, x, t, y, cfg_scale, data_info, c, **kwargs): - return self.base_model.forward_with_cfg(x, t, y, cfg_scale, data_info, c=self.forward_c(c), **kwargs) - - def load_state_dict(self, state_dict: Mapping[str, Any], strict: bool = True): - if all((k.startswith('base_model') or k.startswith('controlnet')) for k in state_dict.keys()): - return super().load_state_dict(state_dict, strict) - else: - new_key = {} - for k in state_dict.keys(): - new_key[k] = re.sub(r"(blocks\.\d+)(.*)", r"\1.base_block\2", k) - for k, v in new_key.items(): - if k != v: - print(f"replace {k} to {v}") - state_dict[v] = state_dict.pop(k) - - return self.base_model.load_state_dict(state_dict, strict) - - def unpatchify(self, x): - """ - x: (N, T, patch_size**2 * C) - imgs: (N, H, W, C) - """ - c = self.out_channels - p = self.x_embedder.patch_size[0] - assert self.h * self.w == x.shape[1] - - x = x.reshape(shape=(x.shape[0], self.h, self.w, p, p, c)) - x = torch.einsum('nhwpqc->nchpwq', x) - imgs = 
x.reshape(shape=(x.shape[0], c, self.h * p, self.w * p)) - return imgs - - # @property - # def dtype(self): - ## return the dtype of the model parameters - # return next(self.parameters()).dtype - - -# The implementation for PixArtMS_Half + 1024 resolution -class ControlPixArtMSHalf(ControlPixArtHalf): - # support multi-scale res model (multi-scale model can also be applied to single reso training & inference) - def __init__(self, base_model: PixArtMS, copy_blocks_num: int = 13) -> None: - super().__init__(base_model=base_model, copy_blocks_num=copy_blocks_num) - - def forward_raw(self, x, timestep, y, mask=None, data_info=None, c=None, **kwargs): - # modify the original PixArtMS forward function - """ - Forward pass of PixArt. - x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) - t: (N,) tensor of diffusion timesteps - y: (N, 1, 120, C) tensor of class labels - """ - if c is not None: - c = c.to(self.dtype) - c = self.forward_c(c) - bs = x.shape[0] - x = x.to(self.dtype) - timestep = timestep.to(self.dtype) - y = y.to(self.dtype) - c_size, ar = data_info['img_hw'].to(self.dtype), data_info['aspect_ratio'].to(self.dtype) - self.h, self.w = x.shape[-2]//self.patch_size, x.shape[-1]//self.patch_size - - pos_embed = torch.from_numpy(get_2d_sincos_pos_embed(self.pos_embed.shape[-1], (self.h, self.w), lewei_scale=self.lewei_scale, base_size=self.base_size)).unsqueeze(0).to(x.device).to(self.dtype) - x = self.x_embedder(x) + pos_embed # (N, T, D), where T = H * W / patch_size ** 2 - t = self.t_embedder(timestep) # (N, D) - csize = self.csize_embedder(c_size, bs) # (N, D) - ar = self.ar_embedder(ar, bs) # (N, D) - t = t + torch.cat([csize, ar], dim=1) - t0 = self.t_block(t) - y = self.y_embedder(y, self.training) # (N, D) - if mask is not None: - if mask.shape[0] != y.shape[0]: - mask = mask.repeat(y.shape[0] // mask.shape[0], 1) - mask = mask.squeeze(1).squeeze(1) - y = y.squeeze(1).masked_select(mask.unsqueeze(-1) != 0).view(1, -1, x.shape[-1]) - y_lens = 
mask.sum(dim=1).tolist() - else: - y_lens = [y.shape[2]] * y.shape[0] - y = y.squeeze(1).view(1, -1, x.shape[-1]) - - # define the first layer - x = auto_grad_checkpoint(self.base_model.blocks[0], x, y, t0, y_lens, **kwargs) # (N, T, D) #support grad checkpoint - - if c is not None: - # update c - for index in range(1, self.copy_blocks_num + 1): - c, c_skip = auto_grad_checkpoint(self.controlnet[index - 1], x, y, t0, y_lens, c, **kwargs) - x = auto_grad_checkpoint(self.base_model.blocks[index], x + c_skip, y, t0, y_lens, **kwargs) - - # update x - for index in range(self.copy_blocks_num + 1, self.total_blocks_num): - x = auto_grad_checkpoint(self.base_model.blocks[index], x, y, t0, y_lens, **kwargs) - else: - for index in range(1, self.total_blocks_num): - x = auto_grad_checkpoint(self.base_model.blocks[index], x, y, t0, y_lens, **kwargs) - - x = self.final_layer(x, t) # (N, T, patch_size ** 2 * out_channels) - x = self.unpatchify(x) # (N, out_channels, H, W) - return x - - def forward(self, x, timesteps, context, img_hw=None, aspect_ratio=None, cn_hint=None, **kwargs): - """ - Forward pass that adapts comfy input to original forward function - x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) - timesteps: (N,) tensor of diffusion timesteps - context: (N, 1, 120, C) conditioning - img_hw: height|width conditioning - aspect_ratio: aspect ratio conditioning - cn_hint: controlnet hint - """ - ## size/ar from cond with fallback based on the latent image shape. 
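The control-branch loop above (first base block runs alone, each copied half-block then emits a `c_skip` that is added to the input of the matching base block, and the remaining base blocks run unmodified) can be sketched with plain callables. This is a simplified illustration only: the real blocks also take `y`, `t0`, `y_lens` and run under `auto_grad_checkpoint`.

```python
def run_with_control(base_blocks, control_blocks, x, c):
    # base_blocks: list of callables x -> x
    # control_blocks: list of callables (x, c) -> (c, c_skip)
    copy_n = len(control_blocks)
    x = base_blocks[0](x)  # the first base block always runs uncontrolled
    if c is not None:
        # controlled section: each copied block updates c and injects a skip
        for i in range(1, copy_n + 1):
            c, c_skip = control_blocks[i - 1](x, c)
            x = base_blocks[i](x + c_skip)
        # remaining base blocks run without control
        for i in range(copy_n + 1, len(base_blocks)):
            x = base_blocks[i](x)
    else:
        # no hint given: behave exactly like the base model
        for i in range(1, len(base_blocks)):
            x = base_blocks[i](x)
    return x
```

Note that the skip is added to the *input* of block `i`, not to its output, matching `x = blocks[index](x + c_skip)` in the diff.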
- bs = x.shape[0] - data_info = {} - if img_hw is None: - data_info["img_hw"] = torch.tensor( - [[x.shape[2]*8, x.shape[3]*8]], - dtype=self.dtype, - device=x.device - ).repeat(bs, 1) - else: - data_info["img_hw"] = img_hw.to(x.dtype) - if aspect_ratio is None or True: - data_info["aspect_ratio"] = torch.tensor( - [[x.shape[2]/x.shape[3]]], - dtype=self.dtype, - device=x.device - ).repeat(bs, 1) - else: - data_info["aspect_ratio"] = aspect_ratio.to(x.dtype) - - ## Still accepts the input w/o that dim but returns garbage - if len(context.shape) == 3: - context = context.unsqueeze(1) - - ## run original forward pass - out = self.forward_raw( - x = x.to(self.dtype), - timestep = timesteps.to(self.dtype), - y = context.to(self.dtype), - c = cn_hint, - data_info=data_info, - ) - - ## only return EPS - out = out.to(torch.float) - eps, rest = out[:, :self.in_channels], out[:, self.in_channels:] - return eps diff --git a/PixArt/models/PixArtMS.py b/PixArt/models/pixartms.py similarity index 61% rename from PixArt/models/PixArtMS.py rename to PixArt/models/pixartms.py index 908589c..a9e7035 100644 --- a/PixArt/models/PixArtMS.py +++ b/PixArt/models/pixartms.py @@ -11,12 +11,10 @@ import torch import torch.nn as nn from tqdm import tqdm -from timm.models.layers import DropPath -from timm.models.vision_transformer import Mlp from .utils import auto_grad_checkpoint, to_2tuple -from .PixArt_blocks import t2i_modulate, CaptionEmbedder, AttentionKVCompress, MultiHeadCrossAttention, T2IFinalLayer, TimestepEmbedder, SizeEmbedder -from .PixArt import PixArt, get_2d_sincos_pos_embed +from .blocks import t2i_modulate, CaptionEmbedder, AttentionKVCompress, MultiHeadCrossAttention, T2IFinalLayer, TimestepEmbedder, SizeEmbedder, Mlp +from .pixart import PixArt, get_2d_sincos_pos_embed class PatchEmbed(nn.Module): @@ -31,12 +29,15 @@ def __init__( norm_layer=None, flatten=True, bias=True, + dtype=None, + device=None, + operations=None ): super().__init__() patch_size = 
to_2tuple(patch_size) self.patch_size = patch_size self.flatten = flatten - self.proj = nn.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size, bias=bias) + self.proj = operations.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size, bias=bias, dtype=dtype, device=device) self.norm = norm_layer(embed_dim) if norm_layer else nn.Identity() def forward(self, x): @@ -52,29 +53,34 @@ class PixArtMSBlock(nn.Module): A PixArt block with adaptive layer norm zero (adaLN-Zero) conditioning. """ def __init__(self, hidden_size, num_heads, mlp_ratio=4.0, drop_path=0., input_size=None, - sampling=None, sr_ratio=1, qk_norm=False, **block_kwargs): + sampling=None, sr_ratio=1, qk_norm=False, dtype=None, device=None, operations=None, **block_kwargs): super().__init__() self.hidden_size = hidden_size - self.norm1 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) + self.norm1 = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) self.attn = AttentionKVCompress( hidden_size, num_heads=num_heads, qkv_bias=True, sampling=sampling, sr_ratio=sr_ratio, - qk_norm=qk_norm, **block_kwargs + qk_norm=qk_norm, dtype=dtype, device=device, operations=operations, **block_kwargs ) - self.cross_attn = MultiHeadCrossAttention(hidden_size, num_heads, **block_kwargs) - self.norm2 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) + self.cross_attn = MultiHeadCrossAttention( + hidden_size, num_heads, dtype=dtype, device=device, operations=operations, **block_kwargs + ) + self.norm2 = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) # to be compatible with lower version pytorch approx_gelu = lambda: nn.GELU(approximate="tanh") - self.mlp = Mlp(in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0) - self.drop_path = DropPath(drop_path) if drop_path > 0. 
else nn.Identity() + self.mlp = Mlp( + in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, + dtype=dtype, device=device, operations=operations + ) self.scale_shift_table = nn.Parameter(torch.randn(6, hidden_size) / hidden_size ** 0.5) def forward(self, x, y, t, mask=None, HW=None, **kwargs): B, N, C = x.shape + dtype = x.dtype - shift_msa, scale_msa, gate_msa, shift_mlp, scale_mlp, gate_mlp = (self.scale_shift_table[None] + t.reshape(B, 6, -1)).chunk(6, dim=1) - x = x + self.drop_path(gate_msa * self.attn(t2i_modulate(self.norm1(x), shift_msa, scale_msa), HW=HW)) + shift_msa, scale_msa, gate_msa, shift_mlp, scale_mlp, gate_mlp = (self.scale_shift_table[None].to(x.dtype) + t.reshape(B, 6, -1)).chunk(6, dim=1) + x = x + (gate_msa * self.attn(t2i_modulate(self.norm1(x), shift_msa, scale_msa), HW=HW)) x = x + self.cross_attn(x, y, mask) - x = x + self.drop_path(gate_mlp * self.mlp(t2i_modulate(self.norm2(x), shift_mlp, scale_mlp))) + x = x + (gate_mlp * self.mlp(t2i_modulate(self.norm2(x), shift_mlp, scale_mlp))) return x @@ -105,40 +111,52 @@ def __init__( micro_condition=True, qk_norm=False, kv_compress_config=None, + dtype=None, + device=None, + operations=None, **kwargs, ): - super().__init__( - input_size=input_size, - patch_size=patch_size, - in_channels=in_channels, - hidden_size=hidden_size, - depth=depth, - num_heads=num_heads, - mlp_ratio=mlp_ratio, - class_dropout_prob=class_dropout_prob, - learn_sigma=learn_sigma, - pred_sigma=pred_sigma, - drop_path=drop_path, - pe_interpolation=pe_interpolation, - config=config, - model_max_length=model_max_length, - qk_norm=qk_norm, - kv_compress_config=kv_compress_config, - **kwargs, - ) - self.dtype = torch.get_default_dtype() + nn.Module.__init__(self) + self.dtype = dtype + self.pred_sigma = pred_sigma + self.in_channels = in_channels + self.out_channels = in_channels * 2 if pred_sigma else in_channels + self.patch_size = patch_size + self.num_heads = num_heads + 
self.pe_interpolation = pe_interpolation + self.pe_precision = pe_precision + self.depth = depth + self.h = self.w = 0 approx_gelu = lambda: nn.GELU(approximate="tanh") self.t_block = nn.Sequential( nn.SiLU(), - nn.Linear(hidden_size, 6 * hidden_size, bias=True) + operations.Linear(hidden_size, 6 * hidden_size, bias=True, dtype=dtype, device=device) + ) + self.x_embedder = PatchEmbed( + patch_size, in_channels, hidden_size, bias=True, + dtype=dtype, device=device, operations=operations + ) + self.t_embedder = TimestepEmbedder( + hidden_size, dtype=dtype, device=device, operations=operations, ) - self.x_embedder = PatchEmbed(patch_size, in_channels, hidden_size, bias=True) - self.y_embedder = CaptionEmbedder(in_channels=caption_channels, hidden_size=hidden_size, uncond_prob=class_dropout_prob, act_layer=approx_gelu, token_num=model_max_length) + self.y_embedder = CaptionEmbedder( + in_channels=caption_channels, hidden_size=hidden_size, uncond_prob=class_dropout_prob, + act_layer=approx_gelu, token_num=model_max_length, + dtype=dtype, device=device, operations=operations, + ) + self.micro_conditioning = micro_condition if self.micro_conditioning: - self.csize_embedder = SizeEmbedder(hidden_size//3) # c_size embed - self.ar_embedder = SizeEmbedder(hidden_size//3) # aspect ratio embed + + self.csize_embedder = SizeEmbedder(hidden_size//3, dtype=dtype, device=device, operations=operations) + self.ar_embedder = SizeEmbedder(hidden_size//3, dtype=dtype, device=device, operations=operations) + + # Will use fixed sin-cos embedding: + num_patches = (input_size // patch_size) * (input_size // patch_size) + self.base_size = input_size // self.patch_size + self.register_buffer("pos_embed", torch.zeros(1, num_patches, hidden_size)) + drop_path = [x.item() for x in torch.linspace(0, drop_path, depth)] # stochastic depth decay rule if kv_compress_config is None: kv_compress_config = { @@ -153,12 +171,17 @@ def __init__( sampling=kv_compress_config['sampling'], 
sr_ratio=int(kv_compress_config['scale_factor']) if i in kv_compress_config['kv_compress_layer'] else 1, qk_norm=qk_norm, + dtype=dtype, + device=device, + operations=operations, ) for i in range(depth) ]) - self.final_layer = T2IFinalLayer(hidden_size, patch_size, self.out_channels) + self.final_layer = T2IFinalLayer( + hidden_size, patch_size, self.out_channels, dtype=dtype, device=device, operations=operations + ) - def forward_raw(self, x, t, y, mask=None, data_info=None, **kwargs): + def forward_orig(self, x, timestep, y, mask=None, data_info=None, **kwargs): """ Original forward pass of PixArt. x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) @@ -166,9 +189,6 @@ def forward_raw(self, x, t, y, mask=None, data_info=None, **kwargs): y: (N, 1, 120, C) tensor of class labels """ bs = x.shape[0] - x = x.to(self.dtype) - timestep = t.to(self.dtype) - y = y.to(self.dtype) pe_interpolation = self.pe_interpolation if pe_interpolation is None or self.pe_precision is not None: @@ -181,10 +201,10 @@ def forward_raw(self, x, t, y, mask=None, data_info=None, **kwargs): self.pos_embed.shape[-1], (self.h, self.w), pe_interpolation=pe_interpolation, base_size=self.base_size ) - ).unsqueeze(0).to(device=x.device, dtype=self.dtype) + ).to(device=x.device, dtype=x.dtype).unsqueeze(0) x = self.x_embedder(x) + pos_embed # (N, T, D), where T = H * W / patch_size ** 2 - t = self.t_embedder(timestep) # (N, D) + t = self.t_embedder(timestep, x.dtype) # (N, D) if self.micro_conditioning: c_size, ar = data_info['img_hw'].to(self.dtype), data_info['aspect_ratio'].to(self.dtype) @@ -212,46 +232,29 @@ def forward_raw(self, x, t, y, mask=None, data_info=None, **kwargs): return x - def forward(self, x, timesteps, context, img_hw=None, aspect_ratio=None, **kwargs): - """ - Forward pass that adapts comfy input to original forward function - x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) - timesteps: (N,) tensor of 
diffusion timesteps - context: (N, 1, 120, C) conditioning - img_hw: height|width conditioning - aspect_ratio: aspect ratio conditioning - """ + def forward(self, x, timesteps, context, width=None, height=None, img_hw=None, aspect_ratio=None, **kwargs): + bs, c, h, w = x.shape + dtype = self.dtype + device = x.device + ## size/ar from cond with fallback based on the latent image shape. bs = x.shape[0] data_info = {} if img_hw is None: - data_info["img_hw"] = torch.tensor( - [[x.shape[2]*8, x.shape[3]*8]], - dtype=self.dtype, - device=x.device - ).repeat(bs, 1) + data_info["img_hw"] = torch.tensor([h*8, w*8], dtype=dtype, device=device).repeat(bs, 1) else: data_info["img_hw"] = img_hw.to(dtype=x.dtype, device=x.device) - if aspect_ratio is None or True: - data_info["aspect_ratio"] = torch.tensor( - [[x.shape[2]/x.shape[3]]], - dtype=self.dtype, - device=x.device - ).repeat(bs, 1) + if aspect_ratio is None: + data_info["aspect_ratio"] = torch.tensor([h/w], dtype=dtype, device=device).repeat(bs, 1) else: - data_info["aspect_ratio"] = aspect_ratio.to(dtype=x.dtype, device=x.device) + data_info["aspect_ratio"] = aspect_ratio.to(dtype=dtype, device=device) ## Still accepts the input w/o that dim but returns garbage if len(context.shape) == 3: context = context.unsqueeze(1) ## run original forward pass - out = self.forward_raw( - x = x.to(self.dtype), - t = timesteps.to(self.dtype), - y = context.to(self.dtype), - data_info=data_info, - ) + out = self.forward_orig(x, timesteps, context, data_info=data_info) ## only return EPS out = out.to(torch.float) diff --git a/PixArt/nodes.py b/PixArt/nodes.py index f027d89..6a1f2dd 100644 --- a/PixArt/nodes.py +++ b/PixArt/nodes.py @@ -1,278 +1,87 @@ -import os -import json -import torch -import folder_paths - -from comfy import utils -from .conf import pixart_conf, pixart_res -from .lora import load_pixart_lora -from .loader import load_pixart - -class PixArtCheckpointLoader: - @classmethod - def INPUT_TYPES(s): - return { - 
"required": { - "ckpt_name": (folder_paths.get_filename_list("checkpoints"),), - "model": (list(pixart_conf.keys()),), - } - } - RETURN_TYPES = ("MODEL",) - RETURN_NAMES = ("model",) - FUNCTION = "load_checkpoint" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt Checkpoint Loader" - - def load_checkpoint(self, ckpt_name, model): - ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name) - model_conf = pixart_conf[model] - model = load_pixart( - model_path = ckpt_path, - model_conf = model_conf, - ) - return (model,) - -class PixArtCheckpointLoaderSimple(PixArtCheckpointLoader): - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "ckpt_name": (folder_paths.get_filename_list("checkpoints"),), - } - } - TITLE = "PixArt Checkpoint Loader (auto)" - - def load_checkpoint(self, ckpt_name): - ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name) - model = load_pixart(model_path=ckpt_path) - return (model,) - -class PixArtResolutionSelect(): - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "model": (list(pixart_res.keys()),), - # keys are the same for both - "ratio": (list(pixart_res["PixArtMS_XL_2"].keys()),{"default":"1.00"}), - } - } - RETURN_TYPES = ("INT","INT") - RETURN_NAMES = ("width","height") - FUNCTION = "get_res" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt Resolution Select" - - def get_res(self, model, ratio): - width, height = pixart_res[model][ratio] - return (width,height) - -class PixArtLoraLoader: - def __init__(self): - self.loaded_lora = None - - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "model": ("MODEL",), - "lora_name": (folder_paths.get_filename_list("loras"), ), - "strength": ("FLOAT", {"default": 1.0, "min": -20.0, "max": 20.0, "step": 0.01}), - } - } - RETURN_TYPES = ("MODEL",) - FUNCTION = "load_lora" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt Load LoRA" - - def load_lora(self, model, lora_name, strength,): - if strength == 0: - return (model) - - 
lora_path = folder_paths.get_full_path("loras", lora_name) - lora = None - if self.loaded_lora is not None: - if self.loaded_lora[0] == lora_path: - lora = self.loaded_lora[1] - else: - temp = self.loaded_lora - self.loaded_lora = None - del temp - - if lora is None: - lora = utils.load_torch_file(lora_path, safe_load=True) - self.loaded_lora = (lora_path, lora) - - model_lora = load_pixart_lora(model, lora, lora_path, strength,) - return (model_lora,) - class PixArtResolutionCond: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "cond": ("CONDITIONING", ), - "width": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), - "height": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), - } - } - - RETURN_TYPES = ("CONDITIONING",) - RETURN_NAMES = ("cond",) - FUNCTION = "add_cond" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt Resolution Conditioning" - - def add_cond(self, cond, width, height): - for c in range(len(cond)): - cond[c][1].update({ - "img_hw": [[height, width]], - "aspect_ratio": [[height/width]], - }) - return (cond,) - -class PixArtControlNetCond: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "cond": ("CONDITIONING",), - "latent": ("LATENT",), - # "image": ("IMAGE",), - # "vae": ("VAE",), - # "strength": ("FLOAT", {"default": 1.0, "min": 0.0, "max": 10.0, "step": 0.01}) - } - } - - RETURN_TYPES = ("CONDITIONING",) - RETURN_NAMES = ("cond",) - FUNCTION = "add_cond" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt ControlNet Conditioning" - - def add_cond(self, cond, latent): - for c in range(len(cond)): - cond[c][1]["cn_hint"] = latent["samples"] * 0.18215 - return (cond,) - -class PixArtT5TextEncode: - """ - Reference code, mostly to verify compatibility. - Once everything works, this should instead inherit from the - T5 text encode node and simply add the extra conds (res/ar). 
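`PixArtLoraLoader` above keeps a one-entry cache (`self.loaded_lora`) so the LoRA file is re-read from disk only when the requested path changes. A generic sketch of that pattern, with `load_fn` standing in for `utils.load_torch_file` (hypothetical stand-in, not the real API):

```python
class OneSlotCache:
    """Remember the last (path, data) pair; reload only on a path change."""

    def __init__(self, load_fn):
        self.load_fn = load_fn
        self.slot = None  # (path, data) of the most recent load, or None

    def get(self, path):
        if self.slot is not None and self.slot[0] == path:
            return self.slot[1]  # cache hit: same path as last time
        data = self.load_fn(path)  # cache miss: load and replace the slot
        self.slot = (path, data)
        return data
```

A single slot is enough here because the node is re-executed with one LoRA at a time; a dict-backed cache would keep every file ever loaded resident in memory.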
- """ - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "text": ("STRING", {"multiline": True}), - "T5": ("T5",), - } - } - - RETURN_TYPES = ("CONDITIONING",) - FUNCTION = "encode" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt T5 Text Encode [Reference]" - - def mask_feature(self, emb, mask): - if emb.shape[0] == 1: - keep_index = mask.sum().item() - return emb[:, :, :keep_index, :], keep_index - else: - masked_feature = emb * mask[:, None, :, None] - return masked_feature, emb.shape[2] - - def encode(self, text, T5): - text = text.lower().strip() - tokenizer_out = T5.tokenizer.tokenizer( - text, - max_length = 120, - padding = 'max_length', - truncation = True, - return_attention_mask = True, - add_special_tokens = True, - return_tensors = 'pt' - ) - tokens = tokenizer_out["input_ids"] - mask = tokenizer_out["attention_mask"] - embs = T5.cond_stage_model.transformer( - input_ids = tokens.to(T5.load_device), - attention_mask = mask.to(T5.load_device), - )['last_hidden_state'].float()[:, None] - masked_embs, keep_index = self.mask_feature( - embs.detach().to("cpu"), - mask.detach().to("cpu") - ) - masked_embs = masked_embs.squeeze(0) # match CLIP/internal - print("Encoded T5:", masked_embs.shape) - return ([[masked_embs, {}]], ) + @classmethod + def INPUT_TYPES(s): + return { + "required": { + "cond": ("CONDITIONING", ), + "width": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), + "height": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), + } + } + + RETURN_TYPES = ("CONDITIONING",) + RETURN_NAMES = ("cond",) + FUNCTION = "add_cond" + CATEGORY = "ExtraModels/PixArt" + TITLE = "PixArt Resolution Conditioning" + + def add_cond(self, cond, width, height): + for c in range(len(cond)): + cond[c][1].update({ + "img_hw": [[height, width]], + "aspect_ratio": [[height/width]], + }) + return (cond,) class PixArtT5FromSD3CLIP: - """ - Split the T5 text encoder away from SD3 - """ - @classmethod - def INPUT_TYPES(s): - return { - "required": { - 
"sd3_clip": ("CLIP",), - "padding": ("INT", {"default": 1, "min": 1, "max": 300}), - } - } - - RETURN_TYPES = ("CLIP",) - RETURN_NAMES = ("t5",) - FUNCTION = "split" - CATEGORY = "ExtraModels/PixArt" - TITLE = "PixArt T5 from SD3 CLIP" - - def split(self, sd3_clip, padding): - try: - from comfy.text_encoders.sd3_clip import SD3Tokenizer, SD3ClipModel - except ImportError: - # fallback for older ComfyUI versions - from comfy.sd3_clip import SD3Tokenizer, SD3ClipModel - import copy - - clip = sd3_clip.clone() - assert clip.cond_stage_model.t5xxl is not None, "CLIP must have T5 loaded!" - - # remove transformer - transformer = clip.cond_stage_model.t5xxl.transformer - clip.cond_stage_model.t5xxl.transformer = None - - # clone object - tmp = SD3ClipModel(clip_l=False, clip_g=False, t5=False) - tmp.t5xxl = copy.deepcopy(clip.cond_stage_model.t5xxl) - # put transformer back - clip.cond_stage_model.t5xxl.transformer = transformer - tmp.t5xxl.transformer = transformer - - # override special tokens - tmp.t5xxl.special_tokens = copy.deepcopy(clip.cond_stage_model.t5xxl.special_tokens) - tmp.t5xxl.special_tokens.pop("end") # make sure empty tokens match - - # add attn mask opt if present in original - if hasattr(sd3_clip.cond_stage_model, "t5_attention_mask"): - tmp.t5_attention_mask = False - - # tokenizer - tok = SD3Tokenizer() - tok.t5xxl.min_length = padding - - clip.cond_stage_model = tmp - clip.tokenizer = tok - - return (clip, ) + """ + Split the T5 text encoder away from SD3 + """ + @classmethod + def INPUT_TYPES(s): + return { + "required": { + "sd3_clip": ("CLIP",), + "padding": ("INT", {"default": 1, "min": 1, "max": 300}), + } + } + + RETURN_TYPES = ("CLIP",) + RETURN_NAMES = ("t5",) + FUNCTION = "split" + CATEGORY = "ExtraModels/PixArt" + TITLE = "PixArt T5 from SD3 CLIP" + + def split(self, sd3_clip, padding): + try: + from comfy.text_encoders.sd3_clip import SD3Tokenizer, SD3ClipModel + except ImportError: + # fallback for older ComfyUI versions + from 
comfy.sd3_clip import SD3Tokenizer, SD3ClipModel # type: ignore + import copy + + clip = sd3_clip.clone() + assert clip.cond_stage_model.t5xxl is not None, "CLIP must have T5 loaded!" + + # remove transformer + transformer = clip.cond_stage_model.t5xxl.transformer + clip.cond_stage_model.t5xxl.transformer = None + + # clone object + tmp = SD3ClipModel(clip_l=False, clip_g=False, t5=False) + tmp.t5xxl = copy.deepcopy(clip.cond_stage_model.t5xxl) + # put transformer back + clip.cond_stage_model.t5xxl.transformer = transformer + tmp.t5xxl.transformer = transformer + + # override special tokens + tmp.t5xxl.special_tokens = copy.deepcopy(clip.cond_stage_model.t5xxl.special_tokens) + tmp.t5xxl.special_tokens.pop("end") # make sure empty tokens match + + # add attn mask opt if present in original + if hasattr(sd3_clip.cond_stage_model, "t5_attention_mask"): + tmp.t5_attention_mask = False + + # tokenizer + tok = SD3Tokenizer() + tok.t5xxl.min_length = padding + + clip.cond_stage_model = tmp + clip.tokenizer = tok + + return (clip, ) NODE_CLASS_MAPPINGS = { - "PixArtCheckpointLoader" : PixArtCheckpointLoader, - "PixArtCheckpointLoaderSimple" : PixArtCheckpointLoaderSimple, - "PixArtResolutionSelect" : PixArtResolutionSelect, - "PixArtLoraLoader" : PixArtLoraLoader, - "PixArtT5TextEncode" : PixArtT5TextEncode, - "PixArtResolutionCond" : PixArtResolutionCond, - "PixArtControlNetCond" : PixArtControlNetCond, - "PixArtT5FromSD3CLIP": PixArtT5FromSD3CLIP, + "PixArtResolutionCond" : PixArtResolutionCond, + "PixArtT5FromSD3CLIP": PixArtT5FromSD3CLIP, } diff --git a/Sana/conf.py b/Sana/conf.py deleted file mode 100644 index 3719c0f..0000000 --- a/Sana/conf.py +++ /dev/null @@ -1,98 +0,0 @@ -""" -List of all Sana model types / settings -""" - -sampling_settings = { - "shift": 3.0, -} - -sana_conf = { - "SanaMS_600M_P1_D28": { - "target": "SanaMS", - "unet_config": { - "in_channels": 32, - "depth": 28, - "hidden_size": 1152, - "patch_size": 1, - "num_heads": 16, - "linear_head_dim": 
32, - "model_max_length": 300, - "y_norm": True, - "attn_type": "linear", - "ffn_type": "glumbconv", - "mlp_ratio": 2.5, - "mlp_acts": ["silu", "silu", None], - "use_pe": False, - "pred_sigma": False, - "learn_sigma": False, - "fp32_attention": True, - }, - "sampling_settings" : sampling_settings, - }, - "SanaMS_1600M_P1_D20": { - "target": "SanaMS", - "unet_config": { - "in_channels": 32, - "depth": 20, - "hidden_size": 2240, - "patch_size": 1, - "num_heads": 20, - "linear_head_dim": 32, - "model_max_length": 300, - "y_norm": True, - "attn_type": "linear", - "ffn_type": "glumbconv", - "mlp_ratio": 2.5, - "mlp_acts": ["silu", "silu", None], - "use_pe": False, - "pred_sigma": False, - "learn_sigma": False, - "fp32_attention": True, - }, - "sampling_settings" : sampling_settings, - }, -} - -sana_res = { - "1024px": { # models/SanaMS 1024x1024 - '0.25': [512, 2048], '0.26': [512, 1984], '0.27': [512, 1920], '0.28': [512, 1856], - '0.32': [576, 1792], '0.33': [576, 1728], '0.35': [576, 1664], '0.40': [640, 1600], - '0.42': [640, 1536], '0.48': [704, 1472], '0.50': [704, 1408], '0.52': [704, 1344], - '0.57': [768, 1344], '0.60': [768, 1280], '0.68': [832, 1216], '0.72': [832, 1152], - '0.78': [896, 1152], '0.82': [896, 1088], '0.88': [960, 1088], '0.94': [960, 1024], - '1.00': [1024,1024], '1.07': [1024, 960], '1.13': [1088, 960], '1.21': [1088, 896], - '1.29': [1152, 896], '1.38': [1152, 832], '1.46': [1216, 832], '1.67': [1280, 768], - '1.75': [1344, 768], '2.00': [1408, 704], '2.09': [1472, 704], '2.40': [1536, 640], - '2.50': [1600, 640], '2.89': [1664, 576], '3.00': [1728, 576], '3.11': [1792, 576], - '3.62': [1856, 512], '3.75': [1920, 512], '3.88': [1984, 512], '4.00': [2048, 512], - }, - "512px": { # models/SanaMS 512x512 - '0.25': [256,1024], '0.26': [256, 992], '0.27': [256, 960], '0.28': [256, 928], - '0.32': [288, 896], '0.33': [288, 864], '0.35': [288, 832], '0.40': [320, 800], - '0.42': [320, 768], '0.48': [352, 736], '0.50': [352, 704], '0.52': [352, 
672], - '0.57': [384, 672], '0.60': [384, 640], '0.68': [416, 608], '0.72': [416, 576], - '0.78': [448, 576], '0.82': [448, 544], '0.88': [480, 544], '0.94': [480, 512], - '1.00': [512, 512], '1.07': [512, 480], '1.13': [544, 480], '1.21': [544, 448], - '1.29': [576, 448], '1.38': [576, 416], '1.46': [608, 416], '1.67': [640, 384], - '1.75': [672, 384], '2.00': [704, 352], '2.09': [736, 352], '2.40': [768, 320], - '2.50': [800, 320], '2.89': [832, 288], '3.00': [864, 288], '3.11': [896, 288], - '3.62': [928, 256], '3.75': [960, 256], '3.88': [992, 256], '4.00': [1024,256] - }, - "2K": { - '0.25': [1024, 4096], '0.26': [1024, 3968], '0.27': [1024, 3840], '0.28': [1024, 3712], - '0.32': [1152, 3584], '0.33': [1152, 3456], '0.35': [1152, 3328], '0.40': [1280, 3200], - '0.42': [1280, 3072], '0.48': [1408, 2944], '0.50': [1408, 2816], '0.52': [1408, 2688], - '0.57': [1536, 2688], '0.60': [1536, 2560], '0.68': [1664, 2432], '0.72': [1664, 2304], - '0.78': [1792, 2304], '0.82': [1792, 2176], '0.88': [1920, 2176], '0.94': [1920, 2048], - '1.00': [2048, 2048], '1.07': [2048, 1920], '1.13': [2176, 1920], '1.21': [2176, 1792], - '1.29': [2304, 1792], '1.38': [2304, 1664], '1.46': [2432, 1664], '1.67': [2560, 1536], - '1.75': [2688, 1536], '2.00': [2816, 1408], '2.09': [2944, 1408], '2.40': [3072, 1280], - '2.50': [3200, 1280], '2.89': [3328, 1152], '3.00': [3456, 1152], '3.11': [3584, 1152], - '3.62': [3712, 1024], '3.75': [3840, 1024], '3.88': [3968, 1024], '4.00': [4096, 1024] - } -} -# These should be the same -sana_res.update({ - "SanaMS_600M_P1_D28": sana_res["1024px"], - "SanaMS_1600M_P1_D20": sana_res["1024px"], -}) diff --git a/Sana/loader.py b/Sana/loader.py index 806ca64..413ffb0 100644 --- a/Sana/loader.py +++ b/Sana/loader.py @@ -1,100 +1,146 @@ +import math +import logging + +import comfy.utils +import comfy.model_base +import comfy.model_detection + import comfy.supported_models_base +import comfy.supported_models import comfy.latent_formats -import 
comfy.model_patcher -import comfy.model_base -import comfy.utils -import comfy.conds -import torch -import math -from comfy import model_management -from comfy.latent_formats import LatentFormat -from .diffusers_convert import convert_state_dict +from .models.sana import Sana +from .models.sana_multi_scale import SanaMS +from .diffusers_convert import convert_state_dict +from ..utils.loader import load_state_dict_from_config -class SanaLatent(LatentFormat): +class SanaLatent(comfy.latent_formats.LatentFormat): latent_channels = 32 def __init__(self): self.scale_factor = 0.41407 - - -class EXM_Sana(comfy.supported_models_base.BASE): - unet_config = {} - unet_extra_config = {} - latent_format = SanaLatent - - def __init__(self, model_conf): - self.model_target = model_conf.get("target") - self.unet_config = model_conf.get("unet_config", {}) - self.sampling_settings = model_conf.get("sampling_settings", {}) - self.latent_format = self.latent_format() - # UNET is handled by extension - self.unet_config["disable_unet_model_creation"] = True - - def model_type(self, state_dict, prefix=""): - return comfy.model_base.ModelType.FLOW - - -class EXM_Sana_Model(comfy.model_base.BaseModel): - def __init__(self, *args, **kwargs): - super().__init__(*args, **kwargs) - - def extra_conds(self, **kwargs): - out = super().extra_conds(**kwargs) - - cn_hint = kwargs.get("cn_hint", None) - if cn_hint is not None: - out["cn_hint"] = comfy.conds.CONDRegular(cn_hint) - - return out - - -def load_sana(model_path, model_conf): - state_dict = comfy.utils.load_torch_file(model_path) - state_dict = state_dict.get("model", state_dict) - - # prefix - for prefix in ["model.diffusion_model.",]: - if any(True for x in state_dict if x.startswith(prefix)): - state_dict = {k[len(prefix):]:v for k,v in state_dict.items()} - - # diffusers - if "adaln_single.linear.weight" in state_dict: - state_dict = convert_state_dict(state_dict) # Diffusers - - parameters = comfy.utils.calculate_parameters(state_dict) 
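The prefix handling in the old `load_sana` above strips `model.diffusion_model.` from checkpoint keys before detection. A slightly more defensive sketch of that step (the original slices the prefix off every key once any key matches; this variant leaves non-matching keys untouched — an assumption, not the diff's exact behavior):

```python
def strip_prefix(state_dict, prefix="model.diffusion_model."):
    """Drop a wrapper prefix from checkpoint keys if any key carries it."""
    if any(k.startswith(prefix) for k in state_dict):
        return {
            (k[len(prefix):] if k.startswith(prefix) else k): v
            for k, v in state_dict.items()
        }
    return state_dict  # already bare: return unchanged
```

The rewritten loader replaces this hand-rolled loop with `comfy.model_detection.unet_prefix_from_state_dict` plus `state_dict_prefix_replace`, which handle mixed prefixes the same way.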
- unet_dtype = comfy.model_management.unet_dtype() - load_device = comfy.model_management.get_torch_device() - offload_device = comfy.model_management.unet_offload_device() - - # ignore fp8/etc and use directly for now - manual_cast_dtype = model_management.unet_manual_cast(unet_dtype, load_device) - if manual_cast_dtype: - print(f"Sana: falling back to {manual_cast_dtype}") - unet_dtype = manual_cast_dtype - - model_conf = EXM_Sana(model_conf) # convert to object - model = EXM_Sana_Model( # same as comfy.model_base.BaseModel - model_conf, - model_type=comfy.model_base.ModelType.FLOW, - device=model_management.get_torch_device() - ) - - if model_conf.model_target == "SanaMS": - from .models.sana_multi_scale import SanaMS - model.diffusion_model = SanaMS(**model_conf.unet_config) - else: - raise NotImplementedError(f"Unknown model target '{model_conf.model_target}'") - - m, u = model.diffusion_model.load_state_dict(state_dict, strict=False) - if len(m) > 0: print("Missing UNET keys", m) - if len(u) > 0: print("Leftover UNET keys", u) - model.diffusion_model.dtype = unet_dtype - model.diffusion_model.eval() - model.diffusion_model.to(unet_dtype) - - model_patcher = comfy.model_patcher.ModelPatcher( - model, - load_device = load_device, - offload_device = offload_device, - ) - return model_patcher + self.latent_rgb_factors = latent_rgb_factors.copy() + self.latent_rgb_factors_bias = latent_rgb_factors_bias.copy() + +class SanaConfig(comfy.supported_models_base.BASE): + unet_class = SanaMS + unet_config = {} + unet_extra_config = {} + + latent_format = SanaLatent + sampling_settings = { + "shift": 3.0, + } + + def model_type(self, state_dict, prefix=""): + return comfy.model_base.ModelType.FLOW + + def get_model(self, state_dict, prefix="", device=None): + return SanaModel(model_config=self, unet_model=self.unet_class, device=device) + +class SanaModel(comfy.model_base.BaseModel): + def __init__(self, *args, model_type=comfy.model_base.ModelType.FLOW, 
unet_model=SanaMS, **kwargs): + super().__init__(*args, model_type=model_type, unet_model=unet_model, **kwargs) + +def load_sana_state_dict(sd, model_options={}): + # prefix / format + sd = sd.get("model", sd) # ref ckpt + diffusion_model_prefix = comfy.model_detection.unet_prefix_from_state_dict(sd) + temp_sd = comfy.utils.state_dict_prefix_replace(sd, {diffusion_model_prefix: ""}, filter_keys=True) + if len(temp_sd) > 0: + sd = temp_sd + + # diffusers convert + if "adaln_single.linear.weight" in sd: + sd = convert_state_dict(sd) + + # model config + model_config = model_config_from_unet(sd) + return load_state_dict_from_config(model_config, sd, model_options) + +def model_config_from_unet(sd): + """ + Guess config based on (converted) state dict. + """ + # shared settings that match between all models + # TODO: some can (should) be enumerated + config = { + "in_channels": 32, + "linear_head_dim": 32, + "model_max_length": 300, + "y_norm": True, + "attn_type": "linear", + "ffn_type": "glumbconv", + "mlp_ratio": 2.5, + "mlp_acts": ["silu", "silu", None], + "use_pe": False, + "pred_sigma": False, + "learn_sigma": False, + "fp32_attention": True, + "patch_size": 1, + } + config["depth"] = sum([key.endswith(".point_conv.conv.weight") for key in sd.keys()]) or 28 + + if "pos_embed" in sd: + config["input_size"] = int(math.sqrt(sd["pos_embed"].shape[1])) * config["patch_size"] + else: + # TODO: this isn't optimal though most models don't use it + config["use_pe"] = False + + if "x_embedder.proj.bias" in sd: + config["hidden_size"] = sd["x_embedder.proj.bias"].shape[0] + + if config.get("hidden_size") == 1152: + config["num_heads"] = 16 + elif config.get("hidden_size") == 2240: + config["num_heads"] = 20 + else: + raise RuntimeError(f"Unknown Sana model config, hidden_size={config.get('hidden_size')}") + + model_config = SanaConfig(config) + logging.debug(f"Sana config:\n{config}") + return model_config + +# for fast latent preview +latent_rgb_factors = [ + [-2.0022e-03, -6.0736e-03, -1.7096e-03], + [ 1.4221e-03,
3.6703e-03, 4.1083e-03], + [ 1.0081e-02, 2.6456e-04, -1.4333e-02], + [-2.4253e-03, 3.0967e-03, -1.0301e-03], + [ 2.2158e-03, 7.7274e-03, -1.3151e-02], + [ 1.1235e-02, 5.7630e-03, 3.6146e-03], + [-7.2899e-02, 1.1062e-02, 3.6103e-02], + [ 3.2346e-02, 2.8678e-02, 2.5014e-02], + [ 1.6469e-03, -1.1364e-03, 2.8366e-03], + [-3.5597e-02, -2.3447e-02, -3.1172e-03], + [-1.9985e-04, -2.0647e-03, -1.2702e-02], + [ 2.1318e-04, 1.2196e-03, -8.3461e-04], + [ 1.3766e-02, 2.7559e-03, 7.3567e-03], + [ 1.3027e-02, 2.6365e-03, 3.0405e-03], + [ 1.5335e-02, 9.4682e-03, 6.7312e-03], + [ 5.1827e-03, -9.4865e-03, 8.5080e-03], + [ 1.4365e-02, -3.2867e-03, 9.5108e-03], + [-4.1216e-03, -1.9177e-03, -3.3726e-03], + [-2.4757e-03, 5.1739e-04, 2.0280e-03], + [-3.5950e-03, 1.0720e-03, 5.3043e-03], + [-5.1758e-03, 8.1040e-03, -3.7564e-02], + [-3.8555e-03, -1.5529e-03, -3.5799e-03], + [-6.6175e-03, -6.8484e-03, -9.9609e-03], + [-2.1656e-03, 5.5770e-05, 1.4936e-03], + [-9.2857e-02, -1.1379e-01, -1.0919e-01], + [ 7.7044e-04, 5.5594e-03, 3.4755e-02], + [ 1.2714e-02, 2.9729e-02, 3.1989e-03], + [-1.1805e-03, 9.0548e-03, -4.1063e-04], + [ 8.3309e-04, 4.9694e-03, 2.3087e-03], + [ 7.8456e-03, 3.9750e-03, 3.5655e-03], + [-1.7552e-03, 4.9306e-03, 1.4210e-02], + [-1.4790e-03, 2.8837e-03, -4.5687e-03] +] +latent_rgb_factors_bias = [0.4358, 0.3814, 0.3388] + +# 512/1024/2K match, TODO: 4K is new, add on release +from ..PixArt.loader import resolutions as pixart_res +resolutions = { + "Sana 512": pixart_res["PixArt 512"], + "Sana 1024": pixart_res["PixArt 1024"], + "Sana 2K": pixart_res["PixArt 2K"], +} diff --git a/Sana/models/__init__.py b/Sana/models/__init__.py new file mode 100644 index 0000000..e69de29 diff --git a/Sana/models/act.py b/Sana/models/act.py index 9df6a7a..7993065 100644 --- a/Sana/models/act.py +++ b/Sana/models/act.py @@ -37,7 +37,7 @@ } -def build_act(name: str or None, **kwargs) -> nn.Module or None: +def build_act(name, **kwargs): if name in REGISTERED_ACT_DICT: act_cls, default_args = 
copy.deepcopy(REGISTERED_ACT_DICT[name]) for key in default_args: @@ -50,7 +50,7 @@ def build_act(name: str or None, **kwargs) -> nn.Module or None: raise ValueError(f"do not support: {name}") -def get_act_name(act: nn.Module or None) -> str or None: +def get_act_name(act): if act is None: return None module2name = {} diff --git a/Sana/models/basic_modules.py b/Sana/models/basic_modules.py index ece579a..6b279a4 100644 --- a/Sana/models/basic_modules.py +++ b/Sana/models/basic_modules.py @@ -17,13 +17,31 @@ # This file is modified from https://github.com/PixArt-alpha/PixArt-sigma import torch import torch.nn as nn -from timm.models.vision_transformer import Mlp +#from timm.models.vision_transformer import Mlp from .act import build_act, get_act_name from .norms import build_norm, get_norm_name from .utils import get_same_padding, val2tuple +class Mlp(nn.Module): + def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, bias=True, drop=None, dtype=None, device=None, operations=None) -> None: + super().__init__() + out_features = out_features or in_features + hidden_features = hidden_features or in_features + + self.fc1 = operations.Linear(in_features, hidden_features, bias=bias, dtype=dtype, device=device) + self.act = act_layer() + self.fc2 = operations.Linear(hidden_features, out_features, bias=bias, dtype=dtype, device=device) + + self.drop1 = nn.Identity() + self.drop2 = nn.Identity() + + def forward(self, x: torch.Tensor) -> torch.Tensor: + x = self.act(self.fc1(x)) + return self.fc2(x) + + class ConvLayer(nn.Module): def __init__( self, @@ -33,11 +51,14 @@ def __init__( stride=1, dilation=1, groups=1, - padding: int or None = None, + padding=None, use_bias=False, dropout=0.0, norm="bn2d", act="relu", + dtype=None, + device=None, + operations=None, ): super().__init__() if padding is None: @@ -54,7 +75,7 @@ def __init__( self.use_bias = use_bias self.dropout = nn.Dropout2d(dropout, inplace=False) if dropout > 0 else None - 
self.conv = nn.Conv2d( + self.conv = operations.Conv2d( in_dim, out_dim, kernel_size=(kernel_size, kernel_size), @@ -63,6 +84,8 @@ def __init__( dilation=(dilation, dilation), groups=groups, bias=use_bias, + dtype=dtype, + device=device, ) self.norm = build_norm(norm, num_features=out_dim) self.act = build_act(act) @@ -86,11 +109,14 @@ def __init__( out_feature=None, kernel_size=3, stride=1, - padding: int or None = None, + padding=None, use_bias=False, norm=(None, None, None), act=("silu", "silu", None), dilation=1, + dtype=None, + device=None, + operations=None, ): out_feature = out_feature or in_features super().__init__() @@ -106,6 +132,9 @@ def __init__( use_bias=use_bias[0], norm=norm[0], act=act[0], + dtype=dtype, + device=device, + operations=operations, ) self.depth_conv = ConvLayer( hidden_features * 2, @@ -118,6 +147,9 @@ def __init__( norm=norm[1], act=None, dilation=dilation, + dtype=dtype, + device=device, + operations=operations, ) self.point_conv = ConvLayer( hidden_features, @@ -126,6 +158,9 @@ def __init__( use_bias=use_bias[2], norm=norm[2], act=act[2], + dtype=dtype, + device=device, + operations=operations, ) # from IPython import embed; embed(header='debug dilate conv') @@ -189,10 +224,13 @@ def __init__( stride=1, mid_dim=None, expand=6, - padding: int or None = None, + padding=None, use_bias=False, norm=(None, None, "ln2d"), act=("silu", "silu", None), + dtype=None, + device=None, + operations=None, ): super().__init__() use_bias = val2tuple(use_bias, 3) @@ -208,6 +246,9 @@ def __init__( use_bias=use_bias[0], norm=norm[0], act=None, + dtype=dtype, + device=device, + operations=operations, ) self.glu_act = build_act(act[0], inplace=False) self.depth_conv = ConvLayer( @@ -220,6 +261,9 @@ def __init__( use_bias=use_bias[1], norm=norm[1], act=act[1], + dtype=dtype, + device=device, + operations=operations, ) self.point_conv = ConvLayer( mid_dim, @@ -228,6 +272,9 @@ def __init__( use_bias=use_bias[2], norm=norm[2], act=act[2], + dtype=dtype, + 
device=device, + operations=operations, ) def forward(self, x: torch.Tensor, HW=None) -> torch.Tensor: @@ -283,6 +330,9 @@ def __init__( stride=1, dilation=1, padding=None, + dtype=None, + device=None, + operations=None, ): super().__init__( in_features=in_features, @@ -291,6 +341,9 @@ def __init__( act_layer=act_layer, bias=bias, drop=drop, + dtype=dtype, + device=device, + operations=operations, ) hidden_features = hidden_features or in_features self.hidden_features = hidden_features @@ -298,7 +351,7 @@ def __init__( padding = get_same_padding(kernel_size) padding *= dilation - self.conv = nn.Conv2d( + self.conv = operations.Conv2d( hidden_features, hidden_features, kernel_size=(kernel_size, kernel_size), @@ -307,6 +360,8 @@ def __init__( dilation=(dilation, dilation), groups=hidden_features, bias=bias, + dtype=dtype, + device=device, ) def forward(self, x, HW=None): @@ -324,38 +379,3 @@ def forward(self, x, HW=None): x = self.fc2(x) x = self.drop2(x) return x - - -class Mlp(Mlp): - """MLP as used in Vision Transformer, MLP-Mixer and related networks""" - - def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, bias=True, drop=0.0): - super().__init__( - in_features=in_features, - hidden_features=hidden_features, - out_features=out_features, - act_layer=act_layer, - bias=bias, - drop=drop, - ) - - def forward(self, x, HW=None): - x = self.fc1(x) - x = self.act(x) - x = self.drop1(x) - x = self.fc2(x) - x = self.drop2(x) - return x - - -if __name__ == "__main__": - model = GLUMBConv( - 1152, - 1152 * 4, - 1152, - use_bias=(True, True, False), - norm=(None, None, None), - act=("silu", "silu", None), - ).cuda() - input = torch.randn(4, 256, 1152).cuda() - output = model(input) diff --git a/Sana/models/norms.py b/Sana/models/norms.py index 4731d69..5174fdc 100644 --- a/Sana/models/norms.py +++ b/Sana/models/norms.py @@ -48,7 +48,7 @@ def extra_repr(self) -> str: } -def build_norm(name="bn2d", num_features=None, affine=True, 
**kwargs) -> nn.Module or None: +def build_norm(name="bn2d", num_features=None, affine=True, **kwargs): if name in ["ln", "ln2d"]: kwargs["normalized_shape"] = num_features kwargs["elementwise_affine"] = affine @@ -67,7 +67,7 @@ def build_norm(name="bn2d", num_features=None, affine=True, **kwargs) -> nn.Modu raise ValueError("do not support: %s" % name) -def get_norm_name(norm: nn.Module or None) -> str or None: +def get_norm_name(norm): if norm is None: return None module2name = {} @@ -171,7 +171,7 @@ def remove_bn(model: nn.Module) -> None: m.forward = lambda x: x -def set_norm_eps(model: nn.Module, eps: float or None = None, momentum: float or None = None) -> None: +def set_norm_eps(model, eps=None, momentum=None): for m in model.modules(): if isinstance(m, (nn.GroupNorm, nn.LayerNorm, _BatchNorm)): if eps is not None: diff --git a/Sana/models/sana.py b/Sana/models/sana.py index 81c2211..113adfb 100644 --- a/Sana/models/sana.py +++ b/Sana/models/sana.py @@ -20,7 +20,6 @@ import numpy as np import torch import torch.nn as nn -from timm.models.layers import DropPath from .basic_modules import DWMlp, GLUMBConv, MBConvPreGLU, Mlp from .sana_blocks import ( @@ -35,7 +34,7 @@ t2i_modulate, ) from .norms import RMSNorm -from .utils import auto_grad_checkpoint, to_2tuple +from .utils import to_2tuple class SanaBlock(nn.Module): @@ -55,10 +54,13 @@ def __init__( ffn_type="mlp", mlp_acts=("silu", "silu", None), linear_head_dim=32, + dtype=None, + device=None, + operations=None, **block_kwargs, ): super().__init__() - self.norm1 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) + self.norm1 = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) if attn_type == "flash": # flash self attention self.attn = FlashAttention( @@ -66,20 +68,28 @@ def __init__( num_heads=num_heads, qkv_bias=True, qk_norm=qk_norm, + dtype=dtype, + device=device, + operations=operations, **block_kwargs, ) elif attn_type == "linear": # linear 
self attention # TODO: Here the num_heads set to 36 for tmp used self_num_heads = hidden_size // linear_head_dim - self.attn = LiteLA(hidden_size, hidden_size, heads=self_num_heads, eps=1e-8, qk_norm=qk_norm) + self.attn = LiteLA( + hidden_size, hidden_size, heads=self_num_heads, eps=1e-8, qk_norm=qk_norm, + dtype=dtype, device=device, operations=operations, + ) elif attn_type == "vanilla": # vanilla self attention - self.attn = Attention(hidden_size, num_heads=num_heads, qkv_bias=True) + self.attn = Attention( + hidden_size, num_heads=num_heads, qkv_bias=True, dtype=dtype, device=device, operations=operations, + ) else: raise ValueError(f"{attn_type} type is not defined.") - self.cross_attn = MultiHeadCrossAttention(hidden_size, num_heads, **block_kwargs) + self.cross_attn = MultiHeadCrossAttention(hidden_size, num_heads, dtype=dtype, device=device, operations=operations, **block_kwargs) self.norm2 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) # to be compatible with lower version pytorch if ffn_type == "dwmlp": @@ -94,6 +104,9 @@ use_bias=(True, True, False), norm=(None, None, None), act=mlp_acts, + dtype=dtype, + device=device, + operations=operations, ) elif ffn_type == "glumbconv_dilate": self.mlp = GLUMBConv( @@ -103,6 +116,9 @@ norm=(None, None, None), act=mlp_acts, dilation=2, + dtype=dtype, + device=device, + operations=operations, ) elif ffn_type == "mbconvpreglu": self.mlp = MBConvPreGLU( @@ -112,15 +128,19 @@ use_bias=(True, True, False), norm=None, act=("silu", "silu", None), + dtype=dtype, + device=device, + operations=operations, ) elif ffn_type == "mlp": approx_gelu = lambda: nn.GELU(approximate="tanh") self.mlp = Mlp( - in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0 + in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0, + dtype=dtype, device=device, operations=operations, # required: the replacement Mlp builds operations.Linear layers ) else: raise
ValueError(f"{ffn_type} type is not defined.") - self.drop_path = DropPath(drop_path) if drop_path > 0.0 else nn.Identity() + self.drop_path = nn.Identity() #DropPath(drop_path) if drop_path > 0.0 else nn.Identity() self.scale_shift_table = nn.Parameter(torch.randn(6, hidden_size) / hidden_size**0.5) def forward(self, x, y, t, mask=None, **kwargs): @@ -146,7 +166,7 @@ class Sana(nn.Module): def __init__( self, - input_size=32, + input_size=None, patch_size=1, in_channels=32, hidden_size=1152, @@ -170,9 +190,13 @@ def __init__( patch_embed_kernel=None, mlp_acts=("silu", "silu", None), linear_head_dim=32, + dtype=None, + device=None, + operations=None, **kwargs, ): super().__init__() + self.dtype = dtype self.pred_sigma = pred_sigma self.in_channels = in_channels self.out_channels = in_channels * 2 if pred_sigma else in_channels @@ -187,22 +211,36 @@ def __init__( kernel_size = patch_embed_kernel or patch_size self.x_embedder = PatchEmbed( - input_size, patch_size, in_channels, hidden_size, kernel_size=kernel_size, bias=True + input_size, patch_size, in_channels, hidden_size, kernel_size=kernel_size, bias=True, + dtype=dtype, device=device, operations=operations ) - self.t_embedder = TimestepEmbedder(hidden_size) + self.t_embedder = TimestepEmbedder(hidden_size, dtype=dtype, device=device, operations=operations) + + if input_size is not None: + self.base_size = input_size // self.patch_size + else: + self.base_size = None + num_patches = self.x_embedder.num_patches - self.base_size = input_size // self.patch_size - # Will use fixed sin-cos embedding: - self.register_buffer("pos_embed", torch.zeros(1, num_patches, hidden_size)) + if self.use_pe and num_patches is not None: + #Will use fixed sin-cos embedding: + self.register_buffer("pos_embed", torch.zeros(1, num_patches, hidden_size)) + else: + self.pos_embed = None approx_gelu = lambda: nn.GELU(approximate="tanh") - self.t_block = nn.Sequential(nn.SiLU(), nn.Linear(hidden_size, 6 * hidden_size, bias=True)) + 
self.t_block = nn.Sequential( + nn.SiLU(), operations.Linear(hidden_size, 6 * hidden_size, bias=True, dtype=dtype, device=device) + ) self.y_embedder = CaptionEmbedder( in_channels=caption_channels, hidden_size=hidden_size, uncond_prob=class_dropout_prob, act_layer=approx_gelu, token_num=model_max_length, + dtype=dtype, + device=device, + operations=operations ) if self.y_norm: self.attention_y_norm = RMSNorm(hidden_size, scale_factor=y_norm_scale_factor, eps=norm_eps) @@ -220,29 +258,32 @@ def __init__( ffn_type=ffn_type, mlp_acts=mlp_acts, linear_head_dim=linear_head_dim, + dtype=dtype, + device=device, + operations=operations ) for i in range(depth) ] ) - self.final_layer = T2IFinalLayer(hidden_size, patch_size, self.out_channels) + self.final_layer = T2IFinalLayer( + hidden_size, patch_size, self.out_channels, dtype=dtype, device=device, operations=operations + ) - def forward(self, x, timestep, y, mask=None, data_info=None, **kwargs): + def forward(self, x, timestep, context, mask=None, data_info=None, **kwargs): """ Forward pass of Sana. 
x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) t: (N,) tensor of diffusion timesteps y: (N, 1, 120, C) tensor of class labels """ - x = x.to(self.dtype) - timestep = timestep.to(self.dtype) - y = y.to(self.dtype) - pos_embed = self.pos_embed.to(self.dtype) + y = context # remap comfy cond name self.h, self.w = x.shape[-2] // self.patch_size, x.shape[-1] // self.patch_size if self.use_pe: + pos_embed = self.pos_embed.to(x.dtype) x = self.x_embedder(x) + pos_embed # (N, T, D), where T = H * W / patch_size ** 2 else: x = self.x_embedder(x) - t = self.t_embedder(timestep.to(x.dtype)) # (N, D) + t = self.t_embedder(timestep, x.dtype) # (N, D) t0 = self.t_block(t) y = self.y_embedder(y, self.training) # (N, 1, L, D) if self.y_norm: @@ -257,7 +298,7 @@ def forward(self, x, timestep, y, mask=None, data_info=None, **kwargs): y_lens = [y.shape[2]] * y.shape[0] y = y.squeeze(1).view(1, -1, x.shape[-1]) for block in self.blocks: - x = auto_grad_checkpoint(block, x, y, t0, y_lens) # (N, T, D) #support grad checkpoint + x = block(x, y, t0, y_lens) # (N, T, D) x = self.final_layer(x, t) # (N, T, patch_size ** 2 * out_channels) x = self.unpatchify(x) # (N, out_channels, H, W) return x @@ -292,39 +333,6 @@ def unpatchify(self, x): imgs = x.reshape(shape=(x.shape[0], c, h * p, h * p)) return imgs - def initialize_weights(self): - # Initialize transformer layers: - def _basic_init(module): - if isinstance(module, nn.Linear): - torch.nn.init.xavier_uniform_(module.weight) - if module.bias is not None: - nn.init.constant_(module.bias, 0) - - self.apply(_basic_init) - - if self.use_pe: - # Initialize (and freeze) pos_embed by sin-cos embedding: - pos_embed = get_2d_sincos_pos_embed( - self.pos_embed.shape[-1], - int(self.x_embedder.num_patches**0.5), - pe_interpolation=self.pe_interpolation, - base_size=self.base_size, - ) - self.pos_embed.data.copy_(torch.from_numpy(pos_embed).float().unsqueeze(0)) - - # Initialize patch_embed like nn.Linear 
(instead of nn.Conv2d): - w = self.x_embedder.proj.weight.data - nn.init.xavier_uniform_(w.view([w.shape[0], -1])) - - # Initialize timestep embedding MLP: - nn.init.normal_(self.t_embedder.mlp[0].weight, std=0.02) - nn.init.normal_(self.t_embedder.mlp[2].weight, std=0.02) - nn.init.normal_(self.t_block[1].weight, std=0.02) - - # Initialize caption embedding MLP: - nn.init.normal_(self.y_embedder.y_proj.fc1.weight, std=0.02) - nn.init.normal_(self.y_embedder.y_proj.fc2.weight, std=0.02) - def get_2d_sincos_pos_embed(embed_dim, grid_size, cls_token=False, extra_tokens=0, pe_interpolation=1.0, base_size=16): """ diff --git a/Sana/models/sana_blocks.py b/Sana/models/sana_blocks.py index cdd37e4..894b23c 100644 --- a/Sana/models/sana_blocks.py +++ b/Sana/models/sana_blocks.py @@ -22,12 +22,13 @@ import torch.nn as nn import torch.nn.functional as F from einops import rearrange -from timm.models.vision_transformer import Attention as Attention_ -from timm.models.vision_transformer import Mlp from transformers import AutoModelForCausalLM from .norms import RMSNorm from .utils import get_same_padding, to_2tuple +from .basic_modules import Mlp + +import comfy.ldm.common_dit sdpa_32b = None Q_4GB_LIMIT = 32000000 @@ -38,11 +39,14 @@ from comfy import model_management if model_management.xformers_enabled(): - import xformers import xformers.ops + if int((xformers.__version__).split(".")[2]) >= 28: + block_diagonal_mask_from_seqlens = xformers.ops.fmha.attn_bias.BlockDiagonalMask.from_seqlens + else: + block_diagonal_mask_from_seqlens = xformers.ops.fmha.BlockDiagonalMask.from_seqlens else: if model_management.xpu_available: - import intel_extension_for_pytorch as ipex + import intel_extension_for_pytorch as ipex # type: ignore import os if not torch.xpu.has_fp64_dtype() and not os.environ.get('IPEX_FORCE_ATTENTION_SLICE', None): from ...utils.IPEX.attention import scaled_dot_product_attention_32_bit @@ -61,7 +65,7 @@ def t2i_modulate(x, shift, scale): class 
MultiHeadCrossAttention(nn.Module): - def __init__(self, d_model, num_heads, attn_drop=0.0, proj_drop=0.0, qk_norm=False, **block_kwargs): + def __init__(self, d_model, num_heads, attn_drop=0.0, proj_drop=0.0, qk_norm=False, dtype=None, device=None, operations=None, **block_kwargs): super().__init__() assert d_model % num_heads == 0, "d_model must be divisible by num_heads" @@ -69,10 +73,10 @@ def __init__(self, d_model, num_heads, attn_drop=0.0, proj_drop=0.0, qk_norm=Fal self.num_heads = num_heads self.head_dim = d_model // num_heads - self.q_linear = nn.Linear(d_model, d_model) - self.kv_linear = nn.Linear(d_model, d_model * 2) + self.q_linear = operations.Linear(d_model, d_model, dtype=dtype, device=device) + self.kv_linear = operations.Linear(d_model, d_model * 2, dtype=dtype, device=device) self.attn_drop = nn.Dropout(attn_drop) - self.proj = nn.Linear(d_model, d_model) + self.proj = operations.Linear(d_model, d_model, dtype=dtype, device=device) self.proj_drop = nn.Dropout(proj_drop) if qk_norm: # not used for now @@ -93,7 +97,7 @@ def forward(self, x, cond, mask=None): if model_management.xformers_enabled(): attn_bias = None if mask is not None: - attn_bias = xformers.ops.fmha.BlockDiagonalMask.from_seqlens([N] * B, mask) + attn_bias = block_diagonal_mask_from_seqlens([N] * B, mask) x = xformers.ops.memory_efficient_attention( q, k, v, p=self.attn_drop.p, @@ -135,7 +139,7 @@ def forward(self, x, cond, mask=None): return x -class LiteLA(Attention_): +class LiteLA(torch.nn.Module): # from attention r"""Lightweight linear attention""" PAD_VAL = 1 @@ -151,9 +155,20 @@ def __init__( use_bias=False, qk_norm=False, norm_eps=1e-5, + dtype=None, + device=None, + operations=None, ): + super().__init__() heads = heads or int(out_dim // dim * heads_ratio) - super().__init__(in_dim, num_heads=heads, qkv_bias=use_bias) + + # assert dim % heads == 0, 'dim should be divisible by num_heads' + self.num_heads = heads + self.head_dim = in_dim // heads + self.scale = 
self.head_dim ** -0.5 + + self.qkv = operations.Linear(in_dim, in_dim * 3, bias=use_bias, dtype=dtype, device=device) + self.proj = operations.Linear(in_dim, in_dim, dtype=dtype, device=device) self.in_dim = in_dim self.out_dim = out_dim @@ -346,7 +361,7 @@ def __call__(self, x: torch.Tensor, mask=None, HW=None, block_id=None) -> torch. return out -class FlashAttention(Attention_): +class FlashAttention(torch.nn.Module): # from attention """Multi-head Flash Attention block with qk norm.""" def __init__( @@ -395,7 +410,7 @@ def forward(self, x, mask=None, HW=None, block_id=None): attn_bias = torch.zeros([B * self.num_heads, q.shape[1], k.shape[1]], dtype=q.dtype, device=q.device) attn_bias.masked_fill_(mask.squeeze(1).repeat(self.num_heads, 1, 1) == 0, float("-inf")) - if _xformers_available: + if model_management.xformers_enabled(): x = xformers.ops.memory_efficient_attention(q, k, v, p=self.attn_drop.p, attn_bias=attn_bias) else: q, k, v = q.transpose(1, 2), k.transpose(1, 2), v.transpose(1, 2) @@ -418,7 +433,31 @@ def forward(self, x, mask=None, HW=None, block_id=None): ################################################################################# # AMP attention with fp32 softmax to fix loss NaN problem during training # ################################################################################# -class Attention(Attention_): +class Attention(torch.nn.Module): + def __init__( + self, + dim, + num_heads=8, + qkv_bias=True, + sampling='conv', + sr_ratio=1, + qk_norm=False, + dtype=None, + device=None, + operations=None, + **block_kwargs, + ): + super().__init__() + assert dim % num_heads == 0, 'dim should be divisible by num_heads' + self.num_heads = num_heads + self.head_dim = dim // num_heads + self.scale = self.head_dim ** -0.5 + + self.qkv = operations.Linear(dim, dim * 3, bias=qkv_bias, dtype=dtype, device=device) + self.q_norm = nn.Identity() + self.k_norm = nn.Identity() + self.proj = operations.Linear(dim, dim, dtype=dtype, device=device) + def 
forward(self, x, HW=None): B, N, C = x.shape qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads).permute(2, 0, 3, 1, 4) @@ -432,11 +471,11 @@ def forward(self, x, HW=None): attn = (q @ k.transpose(-2, -1)) * self.scale attn = attn.softmax(dim=-1) - attn = self.attn_drop(attn) + #attn = self.attn_drop(attn) x = (attn @ v).transpose(1, 2).reshape(B, N, C) x = self.proj(x) - x = self.proj_drop(x) + #x = self.proj_drop(x) return x @@ -445,11 +484,14 @@ class FinalLayer(nn.Module): The final layer of Sana. """ - def __init__(self, hidden_size, patch_size, out_channels): + def __init__(self, hidden_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True) - self.adaLN_modulation = nn.Sequential(nn.SiLU(), nn.Linear(hidden_size, 2 * hidden_size, bias=True)) + self.norm_final = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) + self.adaLN_modulation = nn.Sequential( + nn.SiLU(), + operations.Linear(hidden_size, 2 * hidden_size, bias=True, dtype=dtype, device=device) + ) def forward(self, x, c): shift, scale = self.adaLN_modulation(c).chunk(2, dim=1) @@ -463,17 +505,18 @@ class T2IFinalLayer(nn.Module): The final layer of Sana. 
""" - def __init__(self, hidden_size, patch_size, out_channels): + def __init__(self, hidden_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True) - self.scale_shift_table = nn.Parameter(torch.randn(2, hidden_size) / hidden_size**0.5) + self.norm_final = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) + self.scale_shift_table = nn.Parameter(torch.randn(2, hidden_size) / hidden_size ** 0.5) self.out_channels = out_channels def forward(self, x, t): + dtype = x.dtype shift, scale = (self.scale_shift_table[None] + t[:, None]).chunk(2, dim=1) x = t2i_modulate(self.norm_final(x), shift, scale) - x = self.linear(x) + x = self.linear(x.to(dtype)) return x @@ -482,12 +525,14 @@ class MaskFinalLayer(nn.Module): The final layer of Sana. 
""" - def __init__(self, final_hidden_size, c_emb_size, patch_size, out_channels): + def __init__(self, final_hidden_size, c_emb_size, patch_size, out_channels, dtype=None, device=None, operations=None): super().__init__() - self.norm_final = nn.LayerNorm(final_hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(final_hidden_size, patch_size * patch_size * out_channels, bias=True) - self.adaLN_modulation = nn.Sequential(nn.SiLU(), nn.Linear(c_emb_size, 2 * final_hidden_size, bias=True)) - + self.norm_final = operations.LayerNorm(final_hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(final_hidden_size, patch_size * patch_size * out_channels, bias=True, dtype=dtype, device=device) + self.adaLN_modulation = nn.Sequential( + nn.SiLU(), + operations.Linear(c_emb_size, 2 * final_hidden_size, bias=True, dtype=dtype, device=device) + ) def forward(self, x, t): shift, scale = self.adaLN_modulation(t).chunk(2, dim=1) x = modulate(self.norm_final(x), shift, scale) @@ -500,12 +545,14 @@ class DecoderLayer(nn.Module): The final layer of Sana. 
""" - def __init__(self, hidden_size, decoder_hidden_size): + def __init__(self, hidden_size, decoder_hidden_size, dtype=None, device=None, operations=None): super().__init__() - self.norm_decoder = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) - self.linear = nn.Linear(hidden_size, decoder_hidden_size, bias=True) - self.adaLN_modulation = nn.Sequential(nn.SiLU(), nn.Linear(hidden_size, 2 * hidden_size, bias=True)) - + self.norm_decoder = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) + self.linear = operations.Linear(hidden_size, decoder_hidden_size, bias=True, dtype=dtype, device=device) + self.adaLN_modulation = nn.Sequential( + nn.SiLU(), + operations.Linear(hidden_size, 2 * hidden_size, bias=True, dtype=dtype, device=device) + ) def forward(self, x, t): shift, scale = self.adaLN_modulation(t).chunk(2, dim=1) x = modulate(self.norm_decoder(x), shift, scale) @@ -521,12 +568,12 @@ class TimestepEmbedder(nn.Module): Embeds scalar timesteps into vector representations. 
""" - def __init__(self, hidden_size, frequency_embedding_size=256): + def __init__(self, hidden_size, frequency_embedding_size=256, dtype=None, device=None, operations=None): super().__init__() self.mlp = nn.Sequential( - nn.Linear(frequency_embedding_size, hidden_size, bias=True), + operations.Linear(frequency_embedding_size, hidden_size, bias=True, dtype=dtype, device=device), nn.SiLU(), - nn.Linear(hidden_size, hidden_size, bias=True), + operations.Linear(hidden_size, hidden_size, bias=True, dtype=dtype, device=device), ) self.frequency_embedding_size = frequency_embedding_size @@ -551,9 +598,9 @@ def timestep_embedding(t, dim, max_period=10000): embedding = torch.cat([embedding, torch.zeros_like(embedding[:, :1])], dim=-1) return embedding - def forward(self, t): - t_freq = self.timestep_embedding(t, self.frequency_embedding_size).to(self.dtype) - t_emb = self.mlp(t_freq) + def forward(self, t, dtype): + t_freq = self.timestep_embedding(t, self.frequency_embedding_size) + t_emb = self.mlp(t_freq.to(dtype)) return t_emb @property @@ -644,10 +691,14 @@ def __init__( uncond_prob, act_layer=nn.GELU(approximate="tanh"), token_num=120, + dtype=None, + device=None, + operations=None, ): super().__init__() self.y_proj = Mlp( - in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, drop=0 + in_features=in_channels, hidden_features=hidden_size, out_features=hidden_size, act_layer=act_layer, drop=0, + dtype=dtype, device=device, operations=operations ) self.register_buffer("y_embedding", nn.Parameter(torch.randn(token_num, in_channels) / in_channels**0.5)) self.uncond_prob = uncond_prob @@ -734,30 +785,46 @@ def __init__( norm_layer=None, flatten=True, bias=True, + dynamic_img_pad=True, + padding_mode='circular', + dtype=None, + device=None, + operations=None, ): super().__init__() kernel_size = kernel_size or patch_size img_size = to_2tuple(img_size) patch_size = to_2tuple(patch_size) - self.img_size = img_size 
self.patch_size = patch_size - self.grid_size = (img_size[0] // patch_size[0], img_size[1] // patch_size[1]) - self.num_patches = self.grid_size[0] * self.grid_size[1] + + self.img_size = None + self.grid_size = None + self.num_patches = None # grid size is computed dynamically from each input + + self.flatten = flatten + self.dynamic_img_pad = dynamic_img_pad + self.padding_mode = padding_mode if not padding and kernel_size % 2 > 0: padding = get_same_padding(kernel_size) - self.proj = nn.Conv2d( - in_chans, embed_dim, kernel_size=kernel_size, stride=patch_size, padding=padding, bias=bias + self.proj = operations.Conv2d( + in_chans, embed_dim, kernel_size=kernel_size, stride=patch_size, padding=padding, bias=bias, dtype=dtype, device=device ) self.norm = norm_layer(embed_dim) if norm_layer else nn.Identity() def forward(self, x): B, C, H, W = x.shape - assert (H == self.img_size[0], f"Input image height ({H}) doesn't match model ({self.img_size[0]}).") - assert (W == self.img_size[1], f"Input image width ({W}) doesn't match model ({self.img_size[1]}).") + # assert (H == self.img_size[0], f"Input image height ({H}) doesn't match model ({self.img_size[0]}).") + # assert (W == self.img_size[1], f"Input image width ({W}) doesn't match model ({self.img_size[1]}).") + if self.dynamic_img_pad: + x = comfy.ldm.common_dit.pad_to_patch_size(x, self.patch_size, padding_mode=self.padding_mode) x = self.proj(x) if self.flatten: - x = x.flatten(2).transpose(1, 2) # BCHW -> BNC + x = x.flatten(2).transpose(1, 2) # BCHW -> BNC x = self.norm(x) return x @@ -775,20 +842,29 @@ def __init__( norm_layer=None, flatten=True, bias=True, + dynamic_img_pad=True, + padding_mode='circular', + dtype=None, + device=None, + operations=None, ): super().__init__() kernel_size = kernel_size or patch_size patch_size = to_2tuple(patch_size) 
self.patch_size = patch_size self.flatten = flatten + self.dynamic_img_pad = dynamic_img_pad + self.padding_mode = padding_mode if not padding and kernel_size % 2 > 0: padding = get_same_padding(kernel_size) - self.proj = nn.Conv2d( - in_chans, embed_dim, kernel_size=kernel_size, stride=patch_size, padding=padding, bias=bias + self.proj = operations.Conv2d( + in_chans, embed_dim, kernel_size=kernel_size, stride=patch_size, padding=padding, bias=bias, dtype=dtype, device=device ) self.norm = norm_layer(embed_dim) if norm_layer else nn.Identity() def forward(self, x): + if self.dynamic_img_pad: + x = comfy.ldm.common_dit.pad_to_patch_size(x, self.patch_size, padding_mode=self.padding_mode) x = self.proj(x) if self.flatten: x = x.flatten(2).transpose(1, 2) # BCHW -> BNC diff --git a/Sana/models/sana_multi_scale.py b/Sana/models/sana_multi_scale.py index 2d5452c..bd5bab8 100644 --- a/Sana/models/sana_multi_scale.py +++ b/Sana/models/sana_multi_scale.py @@ -17,13 +17,13 @@ # This file is modified from https://github.com/PixArt-alpha/PixArt-sigma import torch import torch.nn as nn -from timm.models.layers import DropPath from .basic_modules import DWMlp, GLUMBConv, MBConvPreGLU, Mlp from .sana import Sana, get_2d_sincos_pos_embed from .sana_blocks import ( Attention, CaptionEmbedder, + TimestepEmbedder, FlashAttention, LiteLA, MultiHeadCrossAttention, @@ -31,8 +31,7 @@ T2IFinalLayer, t2i_modulate, ) -from .utils import auto_grad_checkpoint - +from .norms import RMSNorm class SanaMSBlock(nn.Module): """ @@ -52,11 +51,14 @@ def __init__( mlp_acts=("silu", "silu", None), linear_head_dim=32, cross_norm=False, + dtype=None, + device=None, + operations=None, **block_kwargs, ): super().__init__() self.hidden_size = hidden_size - self.norm1 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) + self.norm1 = operations.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6, dtype=dtype, device=device) if attn_type == "flash": # flash self attention self.attn = 
FlashAttention( @@ -64,25 +66,34 @@ def __init__( num_heads=num_heads, qkv_bias=True, qk_norm=qk_norm, + dtype=dtype, + device=device, + operations=operations, **block_kwargs, ) elif attn_type == "linear": # linear self attention # TODO: Here the num_heads set to 36 for tmp used self_num_heads = hidden_size // linear_head_dim - self.attn = LiteLA(hidden_size, hidden_size, heads=self_num_heads, eps=1e-8, qk_norm=qk_norm) + self.attn = LiteLA( + hidden_size, hidden_size, heads=self_num_heads, eps=1e-8, qk_norm=qk_norm, + dtype=dtype, device=device, operations=operations, + ) elif attn_type == "vanilla": # vanilla self attention - self.attn = Attention(hidden_size, num_heads=num_heads, qkv_bias=True) + self.attn = Attention( + hidden_size, num_heads=num_heads, qkv_bias=True, dtype=dtype, device=device, operations=operations, + ) else: raise ValueError(f"{attn_type} type is not defined.") - self.cross_attn = MultiHeadCrossAttention(hidden_size, num_heads, qk_norm=cross_norm, **block_kwargs) + self.cross_attn = MultiHeadCrossAttention(hidden_size, num_heads, qk_norm=cross_norm, dtype=dtype, device=device, operations=operations, **block_kwargs) self.norm2 = nn.LayerNorm(hidden_size, elementwise_affine=False, eps=1e-6) if ffn_type == "dwmlp": approx_gelu = lambda: nn.GELU(approximate="tanh") self.mlp = DWMlp( - in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0 + in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0, + dtype=dtype, device=device, operations=operations, ) elif ffn_type == "glumbconv": self.mlp = GLUMBConv( @@ -91,6 +102,9 @@ def __init__( use_bias=(True, True, False), norm=(None, None, None), act=mlp_acts, + dtype=dtype, + device=device, + operations=operations, ) elif ffn_type == "glumbconv_dilate": self.mlp = GLUMBConv( @@ -100,11 +114,15 @@ def __init__( norm=(None, None, None), act=mlp_acts, dilation=2, + dtype=dtype, + device=device, + 
operations=operations, ) elif ffn_type == "mlp": approx_gelu = lambda: nn.GELU(approximate="tanh") self.mlp = Mlp( - in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0 + in_features=hidden_size, hidden_features=int(hidden_size * mlp_ratio), act_layer=approx_gelu, drop=0, + dtype=dtype, device=device, operations=operations, ) elif ffn_type == "mbconvpreglu": self.mlp = MBConvPreGLU( @@ -114,17 +132,20 @@ def __init__( use_bias=(True, True, False), norm=None, act=mlp_acts, + dtype=dtype, + device=device, + operations=operations, ) else: raise ValueError(f"{ffn_type} type is not defined.") - self.drop_path = DropPath(drop_path) if drop_path > 0.0 else nn.Identity() + self.drop_path = nn.Identity() # DropPath(drop_path) if drop_path > 0.0 else nn.Identity() self.scale_shift_table = nn.Parameter(torch.randn(6, hidden_size) / hidden_size**0.5) def forward(self, x, y, t, mask=None, HW=None, **kwargs): B, N, C = x.shape shift_msa, scale_msa, gate_msa, shift_mlp, scale_mlp, gate_mlp = ( - self.scale_shift_table[None] + t.reshape(B, 6, -1) + self.scale_shift_table[None].to(x.dtype) + t.reshape(B, 6, -1) ).chunk(6, dim=1) x = x + self.drop_path(gate_msa * self.attn(t2i_modulate(self.norm1(x), shift_msa, scale_msa), HW=HW)) x = x + self.cross_attn(x, y, mask) @@ -169,51 +190,57 @@ def __init__( mlp_acts=("silu", "silu", None), linear_head_dim=32, cross_norm=False, + dtype=None, + device=None, + operations=None, **kwargs, ): - super().__init__( - input_size=input_size, - patch_size=patch_size, - in_channels=in_channels, - hidden_size=hidden_size, - depth=depth, - num_heads=num_heads, - mlp_ratio=mlp_ratio, - class_dropout_prob=class_dropout_prob, - learn_sigma=learn_sigma, - pred_sigma=pred_sigma, - drop_path=drop_path, - caption_channels=caption_channels, - pe_interpolation=pe_interpolation, - config=config, - model_max_length=model_max_length, - qk_norm=qk_norm, - y_norm=y_norm, - norm_eps=norm_eps, - attn_type=attn_type, - 
ffn_type=ffn_type, - use_pe=use_pe, - y_norm_scale_factor=y_norm_scale_factor, - patch_embed_kernel=patch_embed_kernel, - mlp_acts=mlp_acts, - linear_head_dim=linear_head_dim, - **kwargs, - ) - self.dtype = torch.get_default_dtype() + nn.Module.__init__(self) + self.dtype = dtype + self.pred_sigma = pred_sigma + self.in_channels = in_channels + self.out_channels = in_channels * 2 if pred_sigma else in_channels + self.patch_size = patch_size + self.num_heads = num_heads + self.pe_interpolation = pe_interpolation + self.depth = depth + self.use_pe = use_pe + self.y_norm = y_norm + self.model_max_length = model_max_length + self.fp32_attention = kwargs.get("use_fp32_attention", False) self.h = self.w = 0 + approx_gelu = lambda: nn.GELU(approximate="tanh") - self.t_block = nn.Sequential(nn.SiLU(), nn.Linear(hidden_size, 6 * hidden_size, bias=True)) + self.t_block = nn.Sequential( + nn.SiLU(), operations.Linear(hidden_size, 6 * hidden_size, bias=True, dtype=dtype, device=device) + ) + + self.t_embedder = TimestepEmbedder(hidden_size, dtype=dtype, device=device, operations=operations) + self.pos_embed_ms = None + if input_size is not None: + self.base_size = input_size // self.patch_size + else: + self.base_size = None kernel_size = patch_embed_kernel or patch_size - self.x_embedder = PatchEmbedMS(patch_size, in_channels, hidden_size, kernel_size=kernel_size, bias=True) + self.x_embedder = PatchEmbedMS( + patch_size, in_channels, hidden_size, kernel_size=kernel_size, bias=True, + dtype=dtype, device=device, operations=operations, + ) self.y_embedder = CaptionEmbedder( in_channels=caption_channels, hidden_size=hidden_size, uncond_prob=class_dropout_prob, act_layer=approx_gelu, token_num=model_max_length, + dtype=dtype, + device=device, + operations=operations, ) + if self.y_norm: + self.attention_y_norm = RMSNorm(hidden_size, scale_factor=y_norm_scale_factor, eps=norm_eps) + drop_path = [x.item() for x in torch.linspace(0, drop_path, depth)] # stochastic depth decay rule 
self.blocks = nn.ModuleList( [ @@ -229,13 +256,16 @@ def __init__( mlp_acts=mlp_acts, linear_head_dim=linear_head_dim, cross_norm=cross_norm, + dtype=dtype, + device=device, + operations=operations, ) for i in range(depth) ] ) - self.final_layer = T2IFinalLayer(hidden_size, patch_size, self.out_channels) - - self.initialize() + self.final_layer = T2IFinalLayer( + hidden_size, patch_size, self.out_channels, dtype=dtype, device=device, operations=operations + ) def forward(self, x, timesteps, context, **kwargs): """ @@ -251,7 +281,7 @@ def forward(self, x, timesteps, context, **kwargs): context = context.unsqueeze(1) ## run original forward pass - out = self.forward_raw( + out = self.forward_orig( x = x.to(self.dtype), timestep = timesteps.to(self.dtype), - y = context.to(self.dtype), + context = context.to(self.dtype), @@ -262,7 +292,7 @@ return out - def forward_raw(self, x, timestep, y, mask=None, data_info=None, **kwargs): + def forward_orig(self, x, timestep, context, mask=None, data_info=None, **kwargs): """ Forward pass of Sana. 
x: (N, C, H, W) tensor of spatial inputs (images or latent representations of images) @@ -270,9 +300,9 @@ def forward_raw(self, x, timestep, y, mask=None, data_info=None, **kwargs): y: (N, 1, 120, C) tensor of class labels """ bs = x.shape[0] - x = x.to(self.dtype) - timestep = timestep.to(self.dtype) - y = y.to(self.dtype) + y = context + if len(y.shape) == 3: + y = y.unsqueeze(1) self.h, self.w = x.shape[-2] // self.patch_size, x.shape[-1] // self.patch_size if self.use_pe: x = self.x_embedder(x) @@ -285,16 +315,13 @@ def forward_raw(self, x, timestep, y, mask=None, data_info=None, **kwargs): pe_interpolation=self.pe_interpolation, base_size=self.base_size, ) - ) - .unsqueeze(0) - .to(x.device) - .to(self.dtype) + ).unsqueeze(0).to(x.device).to(x.dtype) ) x += self.pos_embed_ms # (N, T, D), where T = H * W / patch_size ** 2 else: x = self.x_embedder(x) - t = self.t_embedder(timestep) # (N, D) + t = self.t_embedder(timestep, x.dtype) # (N, D) y_lens = ((y != 0).sum(dim=3) > 0).sum(dim=2).squeeze().tolist() y_lens = [y_lens[1]] * bs @@ -311,9 +338,7 @@ def forward_raw(self, x, timestep, y, mask=None, data_info=None, **kwargs): y = y.squeeze(1).masked_select(mask.unsqueeze(-1).bool()).view(1, -1, y.shape[-1]) for block in self.blocks: - x = auto_grad_checkpoint( - block, x, y, t0, y_lens, (self.h, self.w), **kwargs - ) # (N, T, D) #support grad checkpoint + x = block(x, y, t0, y_lens, (self.h, self.w), **kwargs) # (N, T, D) # x = self.final_layer(x, t) # (N, T, patch_size ** 2 * out_channels) x = self.unpatchify(x) # (N, out_channels, H, W) @@ -348,26 +373,3 @@ def unpatchify(self, x): x = torch.einsum("nhwpqc->nchpwq", x) imgs = x.reshape(shape=(x.shape[0], c, self.h * p, self.w * p)) return imgs - - def initialize(self): - # Initialize transformer layers: - def _basic_init(module): - if isinstance(module, nn.Linear): - torch.nn.init.xavier_uniform_(module.weight) - if module.bias is not None: - nn.init.constant_(module.bias, 0) - - self.apply(_basic_init) - - # 
Initialize patch_embed like nn.Linear (instead of nn.Conv2d): - w = self.x_embedder.proj.weight.data - nn.init.xavier_uniform_(w.view([w.shape[0], -1])) - - # Initialize timestep embedding MLP: - nn.init.normal_(self.t_embedder.mlp[0].weight, std=0.02) - nn.init.normal_(self.t_embedder.mlp[2].weight, std=0.02) - nn.init.normal_(self.t_block[1].weight, std=0.02) - - # Initialize caption embedding MLP: - nn.init.normal_(self.y_embedder.y_proj.fc1.weight, std=0.02) - nn.init.normal_(self.y_embedder.y_proj.fc2.weight, std=0.02) diff --git a/Sana/nodes.py b/Sana/nodes.py index abce9ba..ca1628b 100644 --- a/Sana/nodes.py +++ b/Sana/nodes.py @@ -2,151 +2,66 @@ import folder_paths from nodes import EmptyLatentImage -from .conf import sana_conf, sana_res -from .loader import load_sana - -dtypes = [ - "auto", - "FP32", - "FP16", - "BF16" -] - -class SanaCheckpointLoader: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "ckpt_name": (folder_paths.get_filename_list("checkpoints"),), - "model": (list(sana_conf.keys()),), - } - } - RETURN_TYPES = ("MODEL",) - RETURN_NAMES = ("model",) - FUNCTION = "load_checkpoint" - CATEGORY = "ExtraModels/Sana" - TITLE = "Sana Checkpoint Loader" - - def load_checkpoint(self, ckpt_name, model): - ckpt_path = folder_paths.get_full_path("checkpoints", ckpt_name) - model_conf = sana_conf[model] - model = load_sana( - model_path = ckpt_path, - model_conf = model_conf, - ) - return (model,) - - class EmptySanaLatentImage(EmptyLatentImage): - CATEGORY = "ExtraModels/Sana" - TITLE = "Empty Sana Latent Image" - - def generate(self, width, height, batch_size=1): - latent = torch.zeros([batch_size, 32, height // 32, width // 32], device=self.device) - return ({"samples":latent}, ) - - -class SanaResolutionSelect(): - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "model": (list(sana_res.keys()),), - "ratio": (list(sana_res["1024px"].keys()),{"default":"1.00"}), - } - } - RETURN_TYPES = ("INT","INT") - RETURN_NAMES = 
("width","height") - FUNCTION = "get_res" - CATEGORY = "ExtraModels/Sana" - TITLE = "Sana Resolution Select" - - def get_res(self, model, ratio): - width, height = sana_res[model][ratio] - return (width,height) - - -class SanaResolutionCond: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "cond": ("CONDITIONING", ), - "width": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), - "height": ("INT", {"default": 1024.0, "min": 0, "max": 8192}), - } - } - - RETURN_TYPES = ("CONDITIONING",) - RETURN_NAMES = ("cond",) - FUNCTION = "add_cond" - CATEGORY = "ExtraModels/Sana" - TITLE = "Sana Resolution Conditioning" - - def add_cond(self, cond, width, height): - for c in range(len(cond)): - cond[c][1].update({ - "img_hw": [[height, width]], - "aspect_ratio": [[height/width]], - }) - return (cond,) + CATEGORY = "ExtraModels/Sana" + TITLE = "Empty Sana Latent Image" + def generate(self, width, height, batch_size=1): + latent = torch.zeros([batch_size, 32, height // 32, width // 32], device=self.device) + return ({"samples":latent}, ) class SanaTextEncode: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "text": ("STRING", {"multiline": True}), - "GEMMA": ("GEMMA",), - } - } - - RETURN_TYPES = ("CONDITIONING",) - FUNCTION = "encode" - CATEGORY = "ExtraModels/Sana" - TITLE = "Sana Text Encode" - - def encode(self, text, GEMMA=None): - tokenizer = GEMMA["tokenizer"] - text_encoder = GEMMA["text_encoder"] - - with torch.no_grad(): - chi_prompt = "\n".join(preset_te_prompt) - full_prompt = chi_prompt + text - num_chi_tokens = len(tokenizer.encode(chi_prompt)) - max_length = num_chi_tokens + 300 - 2 - - tokens = tokenizer( - [full_prompt], - max_length=max_length, - padding="max_length", - truncation=True, - return_tensors="pt" - ).to(text_encoder.device) - - select_idx = [0] + list(range(-300 + 1, 0)) - embs = text_encoder(tokens.input_ids, tokens.attention_mask)[0][:, None][:, :, select_idx] - emb_masks = tokens.attention_mask[:, select_idx] - 
embs = embs * emb_masks.unsqueeze(-1) - - return ([[embs, {}]], ) + @classmethod + def INPUT_TYPES(s): + return { + "required": { + "text": ("STRING", {"multiline": True}), + "GEMMA": ("GEMMA",), + } + } + + RETURN_TYPES = ("CONDITIONING",) + FUNCTION = "encode" + CATEGORY = "ExtraModels/Sana" + TITLE = "Sana Text Encode" + + def encode(self, text, GEMMA=None): + tokenizer = GEMMA["tokenizer"] + text_encoder = GEMMA["text_encoder"] + + with torch.no_grad(): + chi_prompt = "\n".join(preset_te_prompt) + full_prompt = chi_prompt + text + num_chi_tokens = len(tokenizer.encode(chi_prompt)) + max_length = num_chi_tokens + 300 - 2 + + tokens = tokenizer( + [full_prompt], + max_length=max_length, + padding="max_length", + truncation=True, + return_tensors="pt" + ).to(text_encoder.device) + + select_idx = [0] + list(range(-300 + 1, 0)) + embs = text_encoder(tokens.input_ids, tokens.attention_mask)[0][:, None][:, :, select_idx] + emb_masks = tokens.attention_mask[:, select_idx] + embs = embs * emb_masks.unsqueeze(-1) + + return ([[embs, {}]], ) preset_te_prompt = [ - 'Given a user prompt, generate an "Enhanced prompt" that provides detailed visual descriptions suitable for image generation. 
Evaluate the level of detail in the user prompt:', - '- If the prompt is simple, focus on adding specifics about colors, shapes, sizes, textures, and spatial relationships to create vivid and concrete scenes.', - '- If the prompt is already detailed, refine and enhance the existing details slightly without overcomplicating.', - 'Here are examples of how to transform or refine prompts:', - '- User Prompt: A cat sleeping -> Enhanced: A small, fluffy white cat curled up in a round shape, sleeping peacefully on a warm sunny windowsill, surrounded by pots of blooming red flowers.', - '- User Prompt: A busy city street -> Enhanced: A bustling city street scene at dusk, featuring glowing street lamps, a diverse crowd of people in colorful clothing, and a double-decker bus passing by towering glass skyscrapers.', - 'Please generate only the enhanced description for the prompt below and avoid including any additional commentary or evaluations:', - 'User Prompt: ' + 'Given a user prompt, generate an "Enhanced prompt" that provides detailed visual descriptions suitable for image generation. 
Evaluate the level of detail in the user prompt:', + '- If the prompt is simple, focus on adding specifics about colors, shapes, sizes, textures, and spatial relationships to create vivid and concrete scenes.', + '- If the prompt is already detailed, refine and enhance the existing details slightly without overcomplicating.', + 'Here are examples of how to transform or refine prompts:', + '- User Prompt: A cat sleeping -> Enhanced: A small, fluffy white cat curled up in a round shape, sleeping peacefully on a warm sunny windowsill, surrounded by pots of blooming red flowers.', + '- User Prompt: A busy city street -> Enhanced: A bustling city street scene at dusk, featuring glowing street lamps, a diverse crowd of people in colorful clothing, and a double-decker bus passing by towering glass skyscrapers.', + 'Please generate only the enhanced description for the prompt below and avoid including any additional commentary or evaluations:', + 'User Prompt: ' ] NODE_CLASS_MAPPINGS = { - "SanaCheckpointLoader" : SanaCheckpointLoader, - "SanaResolutionSelect" : SanaResolutionSelect, - "SanaTextEncode" : SanaTextEncode, - "SanaResolutionCond" : SanaResolutionCond, - "EmptySanaLatentImage": EmptySanaLatentImage, + "SanaTextEncode" : SanaTextEncode, + "EmptySanaLatentImage": EmptySanaLatentImage, } diff --git a/T5/LICENSE-ComfyUI b/T5/LICENSE-ComfyUI deleted file mode 100644 index f288702..0000000 --- a/T5/LICENSE-ComfyUI +++ /dev/null @@ -1,674 +0,0 @@ - GNU GENERAL PUBLIC LICENSE - Version 3, 29 June 2007 - - Copyright (C) 2007 Free Software Foundation, Inc. - Everyone is permitted to copy and distribute verbatim copies - of this license document, but changing it is not allowed. - - Preamble - - The GNU General Public License is a free, copyleft license for -software and other kinds of works. - - The licenses for most software and other practical works are designed -to take away your freedom to share and change the works. 
By contrast, -the GNU General Public License is intended to guarantee your freedom to -share and change all versions of a program--to make sure it remains free -software for all its users. We, the Free Software Foundation, use the -GNU General Public License for most of our software; it applies also to -any other work released this way by its authors. You can apply it to -your programs, too. - - When we speak of free software, we are referring to freedom, not -price. Our General Public Licenses are designed to make sure that you -have the freedom to distribute copies of free software (and charge for -them if you wish), that you receive source code or can get it if you -want it, that you can change the software or use pieces of it in new -free programs, and that you know you can do these things. - - To protect your rights, we need to prevent others from denying you -these rights or asking you to surrender the rights. Therefore, you have -certain responsibilities if you distribute copies of the software, or if -you modify it: responsibilities to respect the freedom of others. - - For example, if you distribute copies of such a program, whether -gratis or for a fee, you must pass on to the recipients the same -freedoms that you received. You must make sure that they, too, receive -or can get the source code. And you must show them these terms so they -know their rights. - - Developers that use the GNU GPL protect your rights with two steps: -(1) assert copyright on the software, and (2) offer you this License -giving you legal permission to copy, distribute and/or modify it. - - For the developers' and authors' protection, the GPL clearly explains -that there is no warranty for this free software. For both users' and -authors' sake, the GPL requires that modified versions be marked as -changed, so that their problems will not be attributed erroneously to -authors of previous versions. 
- - Some devices are designed to deny users access to install or run -modified versions of the software inside them, although the manufacturer -can do so. This is fundamentally incompatible with the aim of -protecting users' freedom to change the software. The systematic -pattern of such abuse occurs in the area of products for individuals to -use, which is precisely where it is most unacceptable. Therefore, we -have designed this version of the GPL to prohibit the practice for those -products. If such problems arise substantially in other domains, we -stand ready to extend this provision to those domains in future versions -of the GPL, as needed to protect the freedom of users. - - Finally, every program is threatened constantly by software patents. -States should not allow patents to restrict development and use of -software on general-purpose computers, but in those that do, we wish to -avoid the special danger that patents applied to a free program could -make it effectively proprietary. To prevent this, the GPL assures that -patents cannot be used to render the program non-free. - - The precise terms and conditions for copying, distribution and -modification follow. - - TERMS AND CONDITIONS - - 0. Definitions. - - "This License" refers to version 3 of the GNU General Public License. - - "Copyright" also means copyright-like laws that apply to other kinds of -works, such as semiconductor masks. - - "The Program" refers to any copyrightable work licensed under this -License. Each licensee is addressed as "you". "Licensees" and -"recipients" may be individuals or organizations. - - To "modify" a work means to copy from or adapt all or part of the work -in a fashion requiring copyright permission, other than the making of an -exact copy. The resulting work is called a "modified version" of the -earlier work or a work "based on" the earlier work. - - A "covered work" means either the unmodified Program or a work based -on the Program. 
- - To "propagate" a work means to do anything with it that, without -permission, would make you directly or secondarily liable for -infringement under applicable copyright law, except executing it on a -computer or modifying a private copy. Propagation includes copying, -distribution (with or without modification), making available to the -public, and in some countries other activities as well. - - To "convey" a work means any kind of propagation that enables other -parties to make or receive copies. Mere interaction with a user through -a computer network, with no transfer of a copy, is not conveying. - - An interactive user interface displays "Appropriate Legal Notices" -to the extent that it includes a convenient and prominently visible -feature that (1) displays an appropriate copyright notice, and (2) -tells the user that there is no warranty for the work (except to the -extent that warranties are provided), that licensees may convey the -work under this License, and how to view a copy of this License. If -the interface presents a list of user commands or options, such as a -menu, a prominent item in the list meets this criterion. - - 1. Source Code. - - The "source code" for a work means the preferred form of the work -for making modifications to it. "Object code" means any non-source -form of a work. - - A "Standard Interface" means an interface that either is an official -standard defined by a recognized standards body, or, in the case of -interfaces specified for a particular programming language, one that -is widely used among developers working in that language. 
- - The "System Libraries" of an executable work include anything, other -than the work as a whole, that (a) is included in the normal form of -packaging a Major Component, but which is not part of that Major -Component, and (b) serves only to enable use of the work with that -Major Component, or to implement a Standard Interface for which an -implementation is available to the public in source code form. A -"Major Component", in this context, means a major essential component -(kernel, window system, and so on) of the specific operating system -(if any) on which the executable work runs, or a compiler used to -produce the work, or an object code interpreter used to run it. - - The "Corresponding Source" for a work in object code form means all -the source code needed to generate, install, and (for an executable -work) run the object code and to modify the work, including scripts to -control those activities. However, it does not include the work's -System Libraries, or general-purpose tools or generally available free -programs which are used unmodified in performing those activities but -which are not part of the work. For example, Corresponding Source -includes interface definition files associated with source files for -the work, and the source code for shared libraries and dynamically -linked subprograms that the work is specifically designed to require, -such as by intimate data communication or control flow between those -subprograms and other parts of the work. - - The Corresponding Source need not include anything that users -can regenerate automatically from other parts of the Corresponding -Source. - - The Corresponding Source for a work in source code form is that -same work. - - 2. Basic Permissions. - - All rights granted under this License are granted for the term of -copyright on the Program, and are irrevocable provided the stated -conditions are met. This License explicitly affirms your unlimited -permission to run the unmodified Program. 
The output from running a -covered work is covered by this License only if the output, given its -content, constitutes a covered work. This License acknowledges your -rights of fair use or other equivalent, as provided by copyright law. - - You may make, run and propagate covered works that you do not -convey, without conditions so long as your license otherwise remains -in force. You may convey covered works to others for the sole purpose -of having them make modifications exclusively for you, or provide you -with facilities for running those works, provided that you comply with -the terms of this License in conveying all material for which you do -not control copyright. Those thus making or running the covered works -for you must do so exclusively on your behalf, under your direction -and control, on terms that prohibit them from making any copies of -your copyrighted material outside their relationship with you. - - Conveying under any other circumstances is permitted solely under -the conditions stated below. Sublicensing is not allowed; section 10 -makes it unnecessary. - - 3. Protecting Users' Legal Rights From Anti-Circumvention Law. - - No covered work shall be deemed part of an effective technological -measure under any applicable law fulfilling obligations under article -11 of the WIPO copyright treaty adopted on 20 December 1996, or -similar laws prohibiting or restricting circumvention of such -measures. - - When you convey a covered work, you waive any legal power to forbid -circumvention of technological measures to the extent such circumvention -is effected by exercising rights under this License with respect to -the covered work, and you disclaim any intention to limit operation or -modification of the work as a means of enforcing, against the work's -users, your or third parties' legal rights to forbid circumvention of -technological measures. - - 4. Conveying Verbatim Copies. 
- - You may convey verbatim copies of the Program's source code as you -receive it, in any medium, provided that you conspicuously and -appropriately publish on each copy an appropriate copyright notice; -keep intact all notices stating that this License and any -non-permissive terms added in accord with section 7 apply to the code; -keep intact all notices of the absence of any warranty; and give all -recipients a copy of this License along with the Program. - - You may charge any price or no price for each copy that you convey, -and you may offer support or warranty protection for a fee. - - 5. Conveying Modified Source Versions. - - You may convey a work based on the Program, or the modifications to -produce it from the Program, in the form of source code under the -terms of section 4, provided that you also meet all of these conditions: - - a) The work must carry prominent notices stating that you modified - it, and giving a relevant date. - - b) The work must carry prominent notices stating that it is - released under this License and any conditions added under section - 7. This requirement modifies the requirement in section 4 to - "keep intact all notices". - - c) You must license the entire work, as a whole, under this - License to anyone who comes into possession of a copy. This - License will therefore apply, along with any applicable section 7 - additional terms, to the whole of the work, and all its parts, - regardless of how they are packaged. This License gives no - permission to license the work in any other way, but it does not - invalidate such permission if you have separately received it. - - d) If the work has interactive user interfaces, each must display - Appropriate Legal Notices; however, if the Program has interactive - interfaces that do not display Appropriate Legal Notices, your - work need not make them do so. 
- - A compilation of a covered work with other separate and independent -works, which are not by their nature extensions of the covered work, -and which are not combined with it such as to form a larger program, -in or on a volume of a storage or distribution medium, is called an -"aggregate" if the compilation and its resulting copyright are not -used to limit the access or legal rights of the compilation's users -beyond what the individual works permit. Inclusion of a covered work -in an aggregate does not cause this License to apply to the other -parts of the aggregate. - - 6. Conveying Non-Source Forms. - - You may convey a covered work in object code form under the terms -of sections 4 and 5, provided that you also convey the -machine-readable Corresponding Source under the terms of this License, -in one of these ways: - - a) Convey the object code in, or embodied in, a physical product - (including a physical distribution medium), accompanied by the - Corresponding Source fixed on a durable physical medium - customarily used for software interchange. - - b) Convey the object code in, or embodied in, a physical product - (including a physical distribution medium), accompanied by a - written offer, valid for at least three years and valid for as - long as you offer spare parts or customer support for that product - model, to give anyone who possesses the object code either (1) a - copy of the Corresponding Source for all the software in the - product that is covered by this License, on a durable physical - medium customarily used for software interchange, for a price no - more than your reasonable cost of physically performing this - conveying of source, or (2) access to copy the - Corresponding Source from a network server at no charge. - - c) Convey individual copies of the object code with a copy of the - written offer to provide the Corresponding Source. 
This - alternative is allowed only occasionally and noncommercially, and - only if you received the object code with such an offer, in accord - with subsection 6b. - - d) Convey the object code by offering access from a designated - place (gratis or for a charge), and offer equivalent access to the - Corresponding Source in the same way through the same place at no - further charge. You need not require recipients to copy the - Corresponding Source along with the object code. If the place to - copy the object code is a network server, the Corresponding Source - may be on a different server (operated by you or a third party) - that supports equivalent copying facilities, provided you maintain - clear directions next to the object code saying where to find the - Corresponding Source. Regardless of what server hosts the - Corresponding Source, you remain obligated to ensure that it is - available for as long as needed to satisfy these requirements. - - e) Convey the object code using peer-to-peer transmission, provided - you inform other peers where the object code and Corresponding - Source of the work are being offered to the general public at no - charge under subsection 6d. - - A separable portion of the object code, whose source code is excluded -from the Corresponding Source as a System Library, need not be -included in conveying the object code work. - - A "User Product" is either (1) a "consumer product", which means any -tangible personal property which is normally used for personal, family, -or household purposes, or (2) anything designed or sold for incorporation -into a dwelling. In determining whether a product is a consumer product, -doubtful cases shall be resolved in favor of coverage. 
For a particular -product received by a particular user, "normally used" refers to a -typical or common use of that class of product, regardless of the status -of the particular user or of the way in which the particular user -actually uses, or expects or is expected to use, the product. A product -is a consumer product regardless of whether the product has substantial -commercial, industrial or non-consumer uses, unless such uses represent -the only significant mode of use of the product. - - "Installation Information" for a User Product means any methods, -procedures, authorization keys, or other information required to install -and execute modified versions of a covered work in that User Product from -a modified version of its Corresponding Source. The information must -suffice to ensure that the continued functioning of the modified object -code is in no case prevented or interfered with solely because -modification has been made. - - If you convey an object code work under this section in, or with, or -specifically for use in, a User Product, and the conveying occurs as -part of a transaction in which the right of possession and use of the -User Product is transferred to the recipient in perpetuity or for a -fixed term (regardless of how the transaction is characterized), the -Corresponding Source conveyed under this section must be accompanied -by the Installation Information. But this requirement does not apply -if neither you nor any third party retains the ability to install -modified object code on the User Product (for example, the work has -been installed in ROM). - - The requirement to provide Installation Information does not include a -requirement to continue to provide support service, warranty, or updates -for a work that has been modified or installed by the recipient, or for -the User Product in which it has been modified or installed. 
Access to a -network may be denied when the modification itself materially and -adversely affects the operation of the network or violates the rules and -protocols for communication across the network. - - Corresponding Source conveyed, and Installation Information provided, -in accord with this section must be in a format that is publicly -documented (and with an implementation available to the public in -source code form), and must require no special password or key for -unpacking, reading or copying. - - 7. Additional Terms. - - "Additional permissions" are terms that supplement the terms of this -License by making exceptions from one or more of its conditions. -Additional permissions that are applicable to the entire Program shall -be treated as though they were included in this License, to the extent -that they are valid under applicable law. If additional permissions -apply only to part of the Program, that part may be used separately -under those permissions, but the entire Program remains governed by -this License without regard to the additional permissions. - - When you convey a copy of a covered work, you may at your option -remove any additional permissions from that copy, or from any part of -it. (Additional permissions may be written to require their own -removal in certain cases when you modify the work.) You may place -additional permissions on material, added by you to a covered work, -for which you have or can give appropriate copyright permission. 
- - Notwithstanding any other provision of this License, for material you -add to a covered work, you may (if authorized by the copyright holders of -that material) supplement the terms of this License with terms: - - a) Disclaiming warranty or limiting liability differently from the - terms of sections 15 and 16 of this License; or - - b) Requiring preservation of specified reasonable legal notices or - author attributions in that material or in the Appropriate Legal - Notices displayed by works containing it; or - - c) Prohibiting misrepresentation of the origin of that material, or - requiring that modified versions of such material be marked in - reasonable ways as different from the original version; or - - d) Limiting the use for publicity purposes of names of licensors or - authors of the material; or - - e) Declining to grant rights under trademark law for use of some - trade names, trademarks, or service marks; or - - f) Requiring indemnification of licensors and authors of that - material by anyone who conveys the material (or modified versions of - it) with contractual assumptions of liability to the recipient, for - any liability that these contractual assumptions directly impose on - those licensors and authors. - - All other non-permissive additional terms are considered "further -restrictions" within the meaning of section 10. If the Program as you -received it, or any part of it, contains a notice stating that it is -governed by this License along with a term that is a further -restriction, you may remove that term. If a license document contains -a further restriction but permits relicensing or conveying under this -License, you may add to a covered work material governed by the terms -of that license document, provided that the further restriction does -not survive such relicensing or conveying. 
- - If you add terms to a covered work in accord with this section, you -must place, in the relevant source files, a statement of the -additional terms that apply to those files, or a notice indicating -where to find the applicable terms. - - Additional terms, permissive or non-permissive, may be stated in the -form of a separately written license, or stated as exceptions; -the above requirements apply either way. - - 8. Termination. - - You may not propagate or modify a covered work except as expressly -provided under this License. Any attempt otherwise to propagate or -modify it is void, and will automatically terminate your rights under -this License (including any patent licenses granted under the third -paragraph of section 11). - - However, if you cease all violation of this License, then your -license from a particular copyright holder is reinstated (a) -provisionally, unless and until the copyright holder explicitly and -finally terminates your license, and (b) permanently, if the copyright -holder fails to notify you of the violation by some reasonable means -prior to 60 days after the cessation. - - Moreover, your license from a particular copyright holder is -reinstated permanently if the copyright holder notifies you of the -violation by some reasonable means, this is the first time you have -received notice of violation of this License (for any work) from that -copyright holder, and you cure the violation prior to 30 days after -your receipt of the notice. - - Termination of your rights under this section does not terminate the -licenses of parties who have received copies or rights from you under -this License. If your rights have been terminated and not permanently -reinstated, you do not qualify to receive new licenses for the same -material under section 10. - - 9. Acceptance Not Required for Having Copies. - - You are not required to accept this License in order to receive or -run a copy of the Program. 
Ancillary propagation of a covered work -occurring solely as a consequence of using peer-to-peer transmission -to receive a copy likewise does not require acceptance. However, -nothing other than this License grants you permission to propagate or -modify any covered work. These actions infringe copyright if you do -not accept this License. Therefore, by modifying or propagating a -covered work, you indicate your acceptance of this License to do so. - - 10. Automatic Licensing of Downstream Recipients. - - Each time you convey a covered work, the recipient automatically -receives a license from the original licensors, to run, modify and -propagate that work, subject to this License. You are not responsible -for enforcing compliance by third parties with this License. - - An "entity transaction" is a transaction transferring control of an -organization, or substantially all assets of one, or subdividing an -organization, or merging organizations. If propagation of a covered -work results from an entity transaction, each party to that -transaction who receives a copy of the work also receives whatever -licenses to the work the party's predecessor in interest had or could -give under the previous paragraph, plus a right to possession of the -Corresponding Source of the work from the predecessor in interest, if -the predecessor has it or can get it with reasonable efforts. - - You may not impose any further restrictions on the exercise of the -rights granted or affirmed under this License. For example, you may -not impose a license fee, royalty, or other charge for exercise of -rights granted under this License, and you may not initiate litigation -(including a cross-claim or counterclaim in a lawsuit) alleging that -any patent claim is infringed by making, using, selling, offering for -sale, or importing the Program or any portion of it. - - 11. Patents. 
- - A "contributor" is a copyright holder who authorizes use under this -License of the Program or a work on which the Program is based. The -work thus licensed is called the contributor's "contributor version". - - A contributor's "essential patent claims" are all patent claims -owned or controlled by the contributor, whether already acquired or -hereafter acquired, that would be infringed by some manner, permitted -by this License, of making, using, or selling its contributor version, -but do not include claims that would be infringed only as a -consequence of further modification of the contributor version. For -purposes of this definition, "control" includes the right to grant -patent sublicenses in a manner consistent with the requirements of -this License. - - Each contributor grants you a non-exclusive, worldwide, royalty-free -patent license under the contributor's essential patent claims, to -make, use, sell, offer for sale, import and otherwise run, modify and -propagate the contents of its contributor version. - - In the following three paragraphs, a "patent license" is any express -agreement or commitment, however denominated, not to enforce a patent -(such as an express permission to practice a patent or covenant not to -sue for patent infringement). To "grant" such a patent license to a -party means to make such an agreement or commitment not to enforce a -patent against the party. 
- - If you convey a covered work, knowingly relying on a patent license, -and the Corresponding Source of the work is not available for anyone -to copy, free of charge and under the terms of this License, through a -publicly available network server or other readily accessible means, -then you must either (1) cause the Corresponding Source to be so -available, or (2) arrange to deprive yourself of the benefit of the -patent license for this particular work, or (3) arrange, in a manner -consistent with the requirements of this License, to extend the patent -license to downstream recipients. "Knowingly relying" means you have -actual knowledge that, but for the patent license, your conveying the -covered work in a country, or your recipient's use of the covered work -in a country, would infringe one or more identifiable patents in that -country that you have reason to believe are valid. - - If, pursuant to or in connection with a single transaction or -arrangement, you convey, or propagate by procuring conveyance of, a -covered work, and grant a patent license to some of the parties -receiving the covered work authorizing them to use, propagate, modify -or convey a specific copy of the covered work, then the patent license -you grant is automatically extended to all recipients of the covered -work and works based on it. - - A patent license is "discriminatory" if it does not include within -the scope of its coverage, prohibits the exercise of, or is -conditioned on the non-exercise of one or more of the rights that are -specifically granted under this License. 
You may not convey a covered -work if you are a party to an arrangement with a third party that is -in the business of distributing software, under which you make payment -to the third party based on the extent of your activity of conveying -the work, and under which the third party grants, to any of the -parties who would receive the covered work from you, a discriminatory -patent license (a) in connection with copies of the covered work -conveyed by you (or copies made from those copies), or (b) primarily -for and in connection with specific products or compilations that -contain the covered work, unless you entered into that arrangement, -or that patent license was granted, prior to 28 March 2007. - - Nothing in this License shall be construed as excluding or limiting -any implied license or other defenses to infringement that may -otherwise be available to you under applicable patent law. - - 12. No Surrender of Others' Freedom. - - If conditions are imposed on you (whether by court order, agreement or -otherwise) that contradict the conditions of this License, they do not -excuse you from the conditions of this License. If you cannot convey a -covered work so as to satisfy simultaneously your obligations under this -License and any other pertinent obligations, then as a consequence you may -not convey it at all. For example, if you agree to terms that obligate you -to collect a royalty for further conveying from those to whom you convey -the Program, the only way you could satisfy both those terms and this -License would be to refrain entirely from conveying the Program. - - 13. Use with the GNU Affero General Public License. - - Notwithstanding any other provision of this License, you have -permission to link or combine any covered work with a work licensed -under version 3 of the GNU Affero General Public License into a single -combined work, and to convey the resulting work. 
The terms of this -License will continue to apply to the part which is the covered work, -but the special requirements of the GNU Affero General Public License, -section 13, concerning interaction through a network will apply to the -combination as such. - - 14. Revised Versions of this License. - - The Free Software Foundation may publish revised and/or new versions of -the GNU General Public License from time to time. Such new versions will -be similar in spirit to the present version, but may differ in detail to -address new problems or concerns. - - Each version is given a distinguishing version number. If the -Program specifies that a certain numbered version of the GNU General -Public License "or any later version" applies to it, you have the -option of following the terms and conditions either of that numbered -version or of any later version published by the Free Software -Foundation. If the Program does not specify a version number of the -GNU General Public License, you may choose any version ever published -by the Free Software Foundation. - - If the Program specifies that a proxy can decide which future -versions of the GNU General Public License can be used, that proxy's -public statement of acceptance of a version permanently authorizes you -to choose that version for the Program. - - Later license versions may give you additional or different -permissions. However, no additional obligations are imposed on any -author or copyright holder as a result of your choosing to follow a -later version. - - 15. Disclaimer of Warranty. - - THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY -APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT -HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY -OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, -THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR -PURPOSE. 
THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM -IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF -ALL NECESSARY SERVICING, REPAIR OR CORRECTION. - - 16. Limitation of Liability. - - IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING -WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS -THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY -GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE -USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF -DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD -PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS), -EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF -SUCH DAMAGES. - - 17. Interpretation of Sections 15 and 16. - - If the disclaimer of warranty and limitation of liability provided -above cannot be given local legal effect according to their terms, -reviewing courts shall apply local law that most closely approximates -an absolute waiver of all civil liability in connection with the -Program, unless a warranty or assumption of liability accompanies a -copy of the Program in return for a fee. - - END OF TERMS AND CONDITIONS - - How to Apply These Terms to Your New Programs - - If you develop a new program, and you want it to be of the greatest -possible use to the public, the best way to achieve this is to make it -free software which everyone can redistribute and change under these terms. - - To do so, attach the following notices to the program. It is safest -to attach them to the start of each source file to most effectively -state the exclusion of warranty; and each file should have at least -the "copyright" line and a pointer to where the full notice is found. 
-
-    <one line to give the program's name and a brief idea of what it does.>
-    Copyright (C) <year>  <name of author>
-
-    This program is free software: you can redistribute it and/or modify
-    it under the terms of the GNU General Public License as published by
-    the Free Software Foundation, either version 3 of the License, or
-    (at your option) any later version.
-
-    This program is distributed in the hope that it will be useful,
-    but WITHOUT ANY WARRANTY; without even the implied warranty of
-    MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
-    GNU General Public License for more details.
-
-    You should have received a copy of the GNU General Public License
-    along with this program.  If not, see <https://www.gnu.org/licenses/>.
-
-Also add information on how to contact you by electronic and paper mail.
-
-  If the program does terminal interaction, make it output a short
-notice like this when it starts in an interactive mode:
-
-    <program>  Copyright (C) <year>  <name of author>
-    This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
-    This is free software, and you are welcome to redistribute it
-    under certain conditions; type `show c' for details.
-
-The hypothetical commands `show w' and `show c' should show the appropriate
-parts of the General Public License.  Of course, your program's commands
-might be different; for a GUI interface, you would use an "about box".
-
-  You should also get your employer (if you work as a programmer) or school,
-if any, to sign a "copyright disclaimer" for the program, if necessary.
-For more information on this, and how to apply and follow the GNU GPL, see
-<https://www.gnu.org/licenses/>.
-
-  The GNU General Public License does not permit incorporating your program
-into proprietary programs.  If your program is a subroutine library, you
-may consider it more useful to permit linking proprietary applications with
-the library.  If this is what you want to do, use the GNU Lesser General
-Public License instead of this License.  But first, please read
-<https://www.gnu.org/philosophy/why-not-lgpl.html>.
diff --git a/T5/loader.py b/T5/loader.py deleted file mode 100644 index db321ab..0000000 --- a/T5/loader.py +++ /dev/null @@ -1,118 +0,0 @@ -import os -import torch -import comfy.utils -import comfy.model_patcher -from comfy import model_management -import folder_paths - -from .t5v11 import T5v11Model, T5v11Tokenizer - -class EXM_T5v11: - def __init__(self, textmodel_ver="xxl", embedding_directory=None, textmodel_path=None, no_init=False, device="cpu", dtype=None): - if no_init: - return - - if device == "auto": - size = 0 - self.load_device = model_management.text_encoder_device() - self.offload_device = model_management.text_encoder_offload_device() - self.init_device = "cpu" - elif dtype == "bnb8bit": - # BNB doesn't support size enum - size = 12.4 * (1024**3) - # Or moving between devices - self.load_device = model_management.get_torch_device() - self.offload_device = self.load_device - self.init_device = self.load_device - elif dtype == "bnb4bit": - # This seems to use the same VRAM as 8bit on Pascal? 
- size = 6.2 * (1024**3) - self.load_device = model_management.get_torch_device() - self.offload_device = self.load_device - self.init_device = self.load_device - elif device == "cpu": - size = 0 - self.load_device = "cpu" - self.offload_device = "cpu" - self.init_device="cpu" - elif device.startswith("cuda"): - print("Direct CUDA device override!\nVRAM will not be freed by default.") - size = 0 - self.load_device = device - self.offload_device = device - self.init_device = device - else: - size = 0 - self.load_device = model_management.get_torch_device() - self.offload_device = "cpu" - self.init_device="cpu" - - self.cond_stage_model = T5v11Model( - textmodel_ver = textmodel_ver, - textmodel_path = textmodel_path, - device = device, - dtype = dtype, - ) - self.tokenizer = T5v11Tokenizer(embedding_directory=embedding_directory) - self.patcher = comfy.model_patcher.ModelPatcher( - self.cond_stage_model, - load_device = self.load_device, - offload_device = self.offload_device, - size = size, - ) - - def clone(self): - n = T5(no_init=True) - n.patcher = self.patcher.clone() - n.cond_stage_model = self.cond_stage_model - n.tokenizer = self.tokenizer - return n - - def tokenize(self, text, return_word_ids=False): - return self.tokenizer.tokenize_with_weights(text, return_word_ids) - - def encode_from_tokens(self, tokens): - self.load_model() - return self.cond_stage_model.encode_token_weights(tokens) - - def encode(self, text): - tokens = self.tokenize(text) - return self.encode_from_tokens(tokens) - - def load_sd(self, sd): - return self.cond_stage_model.load_sd(sd) - - def get_sd(self): - return self.cond_stage_model.state_dict() - - def load_model(self): - if self.load_device != "cpu": - model_management.load_model_gpu(self.patcher) - return self.patcher - - def add_patches(self, patches, strength_patch=1.0, strength_model=1.0): - return self.patcher.add_patches(patches, strength_patch, strength_model) - - def get_key_patches(self): - return 
self.patcher.get_key_patches() - - -def load_t5(model_type, model_ver, model_path, path_type="file", device="cpu", dtype=None): - assert model_type in ["t5v11"] # Only supported model for now - model_args = { - "textmodel_ver" : model_ver, - "device" : device, - "dtype" : dtype, - } - - if path_type == "folder": - # pass directly to transformers and initialize there - # this is to avoid having to handle multi-file state dict loading for now. - model_args["textmodel_path"] = os.path.dirname(model_path) - return EXM_T5v11(**model_args) - else: - # for some reason this returns garbage with torch.int8 weights, or just OOMs - model = EXM_T5v11(**model_args) - sd = comfy.utils.load_torch_file(model_path) - model.load_sd(sd) - return model diff --git a/T5/nodes.py b/T5/nodes.py deleted file mode 100644 index 021386e..0000000 --- a/T5/nodes.py +++ /dev/null @@ -1,95 +0,0 @@ -import os -import json -import torch -import folder_paths - -from .loader import load_t5 -from ..utils.dtype import string_to_dtype - -# initialize custom folder path -os.makedirs( - os.path.join(folder_paths.models_dir,"t5"), - exist_ok = True, -) -folder_paths.folder_names_and_paths["t5"] = ( - [ - os.path.join(folder_paths.models_dir,"t5"), - *folder_paths.folder_names_and_paths.get("t5", [[],set()])[0] - ], - folder_paths.supported_pt_extensions -) - -dtypes = [ - "default", - "auto (comfy)", - "FP32", - "FP16", - # Note: remove these at some point - "bnb8bit", - "bnb4bit", -] -try: torch.float8_e5m2 -except AttributeError: print("Torch version too old for FP8") -else: dtypes += ["FP8 E4M3", "FP8 E5M2"] - -class T5v11Loader: - @classmethod - def INPUT_TYPES(s): - devices = ["auto", "cpu", "gpu"] - # hack for using second GPU as offload - for k in range(1, torch.cuda.device_count()): - devices.append(f"cuda:{k}") - return { - "required": { - "t5v11_name": (folder_paths.get_filename_list("t5"),), - "t5v11_ver": (["xxl"],), - "path_type": (["folder", "file"],), - "device": (devices, 
{"default":"cpu"}), - "dtype": (dtypes,), - } - } - RETURN_TYPES = ("T5",) - FUNCTION = "load_model" - CATEGORY = "ExtraModels/T5" - TITLE = "T5v1.1 Loader" - - def load_model(self, t5v11_name, t5v11_ver, path_type, device, dtype): - if "bnb" in dtype: - assert device == "gpu" or device.startswith("cuda"), "BitsAndBytes only works on CUDA! Set device to 'gpu'." - dtype = string_to_dtype(dtype, "text_encoder") - if device == "cpu": - assert dtype in [None, torch.float32], f"Can't use dtype '{dtype}' with CPU! Set dtype to 'default'." - - return (load_t5( - model_type = "t5v11", - model_ver = t5v11_ver, - model_path = folder_paths.get_full_path("t5", t5v11_name), - path_type = path_type, - device = device, - dtype = dtype, - ),) - -class T5TextEncode: - @classmethod - def INPUT_TYPES(s): - return { - "required": { - "text": ("STRING", {"multiline": True}), - "T5": ("T5",), - } - } - - RETURN_TYPES = ("CONDITIONING",) - FUNCTION = "encode" - CATEGORY = "ExtraModels/T5" - TITLE = "T5 Text Encode" - - def encode(self, text, T5=None): - tokens = T5.tokenize(text) - cond = T5.encode_from_tokens(tokens) - return ([[cond, {}]], ) - -NODE_CLASS_MAPPINGS = { - "T5v11Loader" : T5v11Loader, - "T5TextEncode" : T5TextEncode, -} diff --git a/T5/t5v11-xxl_config.json b/T5/t5v11-xxl_config.json deleted file mode 100644 index d133daa..0000000 --- a/T5/t5v11-xxl_config.json +++ /dev/null @@ -1,31 +0,0 @@ -{ - "_name_or_path": "google/t5-v1_1-xxl", - "architectures": [ - "T5EncoderModel" - ], - "d_ff": 10240, - "d_kv": 64, - "d_model": 4096, - "decoder_start_token_id": 0, - "dense_act_fn": "gelu_new", - "dropout_rate": 0.1, - "eos_token_id": 1, - "feed_forward_proj": "gated-gelu", - "initializer_factor": 1.0, - "is_encoder_decoder": true, - "is_gated_act": true, - "layer_norm_epsilon": 1e-06, - "model_type": "t5", - "num_decoder_layers": 24, - "num_heads": 64, - "num_layers": 24, - "output_past": true, - "pad_token_id": 0, - "relative_attention_max_distance": 128, - 
"relative_attention_num_buckets": 32, - "tie_word_embeddings": false, - "torch_dtype": "float32", - "transformers_version": "4.21.1", - "use_cache": true, - "vocab_size": 32128 -} diff --git a/T5/t5v11.py b/T5/t5v11.py deleted file mode 100644 index 76cec9c..0000000 --- a/T5/t5v11.py +++ /dev/null @@ -1,227 +0,0 @@ -""" -Adapted from comfyui CLIP code. -https://github.com/comfyanonymous/ComfyUI/blob/master/comfy/sd1_clip.py -""" - -import os - -from transformers import T5Tokenizer, T5EncoderModel, T5Config, modeling_utils -import torch -import traceback -import zipfile -from comfy import model_management - -from comfy.sd1_clip import parse_parentheses, token_weights, escape_important, unescape_important, safe_load_embed_zip, expand_directory_list, load_embed - -class T5v11Model(torch.nn.Module): - def __init__(self, textmodel_ver="xxl", textmodel_json_config=None, textmodel_path=None, device="cpu", max_length=120, freeze=True, dtype=None): - super().__init__() - - self.num_layers = 24 - self.max_length = max_length - self.bnb = False - - if textmodel_path is not None: - model_args = {} - model_args["low_cpu_mem_usage"] = True # Don't take 2x system ram on cpu - if dtype == "bnb8bit": - self.bnb = True - model_args["load_in_8bit"] = True - elif dtype == "bnb4bit": - self.bnb = True - model_args["load_in_4bit"] = True - else: - if dtype: model_args["torch_dtype"] = dtype - self.bnb = False - # second GPU offload hack part 2 - if device.startswith("cuda"): - model_args["device_map"] = device - print(f"Loading T5 from '{textmodel_path}'") - self.transformer = T5EncoderModel.from_pretrained(textmodel_path, **model_args) - else: - if textmodel_json_config is None: - textmodel_json_config = os.path.join( - os.path.dirname(os.path.realpath(__file__)), - f"t5v11-{textmodel_ver}_config.json" - ) - config = T5Config.from_json_file(textmodel_json_config) - self.num_layers = config.num_hidden_layers - with modeling_utils.no_init_weights(): - self.transformer = 
T5EncoderModel(config) - - if freeze: - self.freeze() - self.empty_tokens = [[0] * self.max_length] # <pad> token - - def freeze(self): - self.transformer = self.transformer.eval() - for param in self.parameters(): - param.requires_grad = False - - def forward(self, tokens): - device = self.transformer.get_input_embeddings().weight.device - tokens = torch.LongTensor(tokens).to(device) - attention_mask = torch.zeros_like(tokens) - max_token = 1 # </s> token - for x in range(attention_mask.shape[0]): - for y in range(attention_mask.shape[1]): - attention_mask[x, y] = 1 - if tokens[x, y] == max_token: - break - - outputs = self.transformer(input_ids=tokens, attention_mask=attention_mask) - - z = outputs['last_hidden_state'] - z = z.detach().cpu().float() - return z - - def encode(self, tokens): - return self(tokens) - - def load_sd(self, sd): - return self.transformer.load_state_dict(sd, strict=False) - - def to(self, *args, **kwargs): - """BNB complains if you try to change the device or dtype""" - if self.bnb: - print("Thanks to BitsAndBytes, T5 becomes an immovable rock.", args, kwargs) - else: - self.transformer.to(*args, **kwargs) - - def encode_token_weights(self, token_weight_pairs, return_padded=False): - to_encode = list(self.empty_tokens) - for x in token_weight_pairs: - tokens = list(map(lambda a: a[0], x)) - to_encode.append(tokens) - - out = self.encode(to_encode) - z_empty = out[0:1] - - output = [] - for k in range(1, out.shape[0]): - z = out[k:k+1] - for i in range(len(z)): - for j in range(len(z[i])): - weight = token_weight_pairs[k - 1][j][1] - z[i][j] = (z[i][j] - z_empty[0][j]) * weight + z_empty[0][j] - output.append(z) - - if (len(output) == 0): - return z_empty.cpu() - - out = torch.cat(output, dim=-2) - if not return_padded: - # Count number of tokens that aren't <pad>, then use that number as an index. 
- keep_index = sum([sum([1 for y in x if y[0] != 0]) for x in token_weight_pairs]) - out = out[:, :keep_index, :] - return out - - -class T5v11Tokenizer: - """ - This is largely just based on the ComfyUI CLIP code. - """ - def __init__(self, tokenizer_path=None, max_length=120, embedding_directory=None, embedding_size=4096, embedding_key='t5'): - if tokenizer_path is None: - tokenizer_path = os.path.join(os.path.dirname(os.path.realpath(__file__)), "t5_tokenizer") - self.tokenizer = T5Tokenizer.from_pretrained(tokenizer_path) - self.max_length = max_length - self.max_tokens_per_section = self.max_length - 1 # </s> but no <s> - - self.pad_token = self.tokenizer("<pad>", add_special_tokens=False)["input_ids"][0] - self.end_token = self.tokenizer("</s>", add_special_tokens=False)["input_ids"][0] - vocab = self.tokenizer.get_vocab() - self.inv_vocab = {v: k for k, v in vocab.items()} - self.embedding_directory = embedding_directory - self.max_word_length = 8 # haven't verified this - self.embedding_identifier = "embedding:" - self.embedding_size = embedding_size - self.embedding_key = embedding_key - - def _try_get_embedding(self, embedding_name:str): - ''' - Takes a potential embedding name and tries to retrieve it. - Returns a Tuple consisting of the embedding and any leftover string, embedding can be None. - ''' - embed = load_embed(embedding_name, self.embedding_directory, self.embedding_size, self.embedding_key) - if embed is None: - stripped = embedding_name.strip(',') - if len(stripped) < len(embedding_name): - embed = load_embed(stripped, self.embedding_directory, self.embedding_size, self.embedding_key) - return (embed, embedding_name[len(stripped):]) - return (embed, "") - - def tokenize_with_weights(self, text:str, return_word_ids=False): - ''' - Takes a prompt and converts it to a list of (token, weight, word id) elements. - Tokens can both be integer tokens and pre-computed T5 tensors. 
- Word id values are unique per word and embedding, where the id 0 is reserved for non word tokens. - Returned list has the dimensions NxM where M is the input size of T5 - ''' - pad_token = self.pad_token - text = escape_important(text) - parsed_weights = token_weights(text, 1.0) - - #tokenize words - tokens = [] - for weighted_segment, weight in parsed_weights: - to_tokenize = unescape_important(weighted_segment).replace("\n", " ").split(' ') - to_tokenize = [x for x in to_tokenize if x != ""] - for word in to_tokenize: - #if we find an embedding, deal with the embedding - if word.startswith(self.embedding_identifier) and self.embedding_directory is not None: - embedding_name = word[len(self.embedding_identifier):].strip('\n') - embed, leftover = self._try_get_embedding(embedding_name) - if embed is None: - print(f"warning, embedding:{embedding_name} does not exist, ignoring") - else: - if len(embed.shape) == 1: - tokens.append([(embed, weight)]) - else: - tokens.append([(embed[x], weight) for x in range(embed.shape[0])]) - #if we accidentally have leftover text, continue parsing using leftover, else move on to next word - if leftover != "": - word = leftover - else: - continue - #parse word - tokens.append([(t, weight) for t in self.tokenizer(word, add_special_tokens=False)["input_ids"]]) - - #reshape token array to T5 input size - batched_tokens = [] - batch = [] - batched_tokens.append(batch) - for i, t_group in enumerate(tokens): - #determine if we're going to try and keep the tokens in a single batch - is_large = len(t_group) >= self.max_word_length - - while len(t_group) > 0: - if len(t_group) + len(batch) > self.max_length - 1: - remaining_length = self.max_length - len(batch) - 1 - #break word in two and add end token - if is_large: - batch.extend([(t,w,i+1) for t,w in t_group[:remaining_length]]) - batch.append((self.end_token, 1.0, 0)) - t_group = t_group[remaining_length:] - #add end token and pad - else: - batch.append((self.end_token, 1.0, 0)) - 
batch.extend([(self.pad_token, 1.0, 0)] * (remaining_length)) - #start new batch - batch = [] - batched_tokens.append(batch) - else: - batch.extend([(t,w,i+1) for t,w in t_group]) - t_group = [] - - # fill last batch - batch.extend([(self.end_token, 1.0, 0)] + [(self.pad_token, 1.0, 0)] * (self.max_length - len(batch) - 1)) - # instead of filling, just add EOS (DEBUG) - # batch.extend([(self.end_token, 1.0, 0)]) - - if not return_word_ids: - batched_tokens = [[(t, w) for t, w,_ in x] for x in batched_tokens] - return batched_tokens - - def untokenize(self, token_weight_pair): - return list(map(lambda a: (a, self.inv_vocab[a[0]]), token_weight_pair)) diff --git a/__init__.py b/__init__.py index 1fff84c..2f9bee0 100644 --- a/__init__.py +++ b/__init__.py @@ -1,51 +1,42 @@ # only import if running as a custom node try: - import comfy.utils + import comfy.utils except ImportError: - pass + pass else: - NODE_CLASS_MAPPINGS = {} + NODE_CLASS_MAPPINGS = {} - # Deci Diffusion - # from .DeciDiffusion.nodes import NODE_CLASS_MAPPINGS as DeciDiffusion_Nodes - # NODE_CLASS_MAPPINGS.update(DeciDiffusion_Nodes) + # Generic/universal nodes + from .nodes import NODE_CLASS_MAPPINGS as Base_Nodes + NODE_CLASS_MAPPINGS.update(Base_Nodes) - # DiT - from .DiT.nodes import NODE_CLASS_MAPPINGS as DiT_Nodes - NODE_CLASS_MAPPINGS.update(DiT_Nodes) + # DiT + from .DiT.nodes import NODE_CLASS_MAPPINGS as DiT_Nodes + NODE_CLASS_MAPPINGS.update(DiT_Nodes) - # PixArt - from .PixArt.nodes import NODE_CLASS_MAPPINGS as PixArt_Nodes - NODE_CLASS_MAPPINGS.update(PixArt_Nodes) + # PixArt + from .PixArt.nodes import NODE_CLASS_MAPPINGS as PixArt_Nodes + NODE_CLASS_MAPPINGS.update(PixArt_Nodes) - # T5 - from .T5.nodes import NODE_CLASS_MAPPINGS as T5_Nodes - NODE_CLASS_MAPPINGS.update(T5_Nodes) - - # HYDiT - from .HunYuanDiT.nodes import NODE_CLASS_MAPPINGS as HunYuanDiT_Nodes - NODE_CLASS_MAPPINGS.update(HunYuanDiT_Nodes) - - # VAE - from .VAE.nodes import NODE_CLASS_MAPPINGS as VAE_Nodes - 
NODE_CLASS_MAPPINGS.update(VAE_Nodes) + # VAE + from .VAE.nodes import NODE_CLASS_MAPPINGS as VAE_Nodes + NODE_CLASS_MAPPINGS.update(VAE_Nodes) - # MiaoBi - from .MiaoBi.nodes import NODE_CLASS_MAPPINGS as MiaoBi_Nodes - NODE_CLASS_MAPPINGS.update(MiaoBi_Nodes) - - # Extra - from .utils.nodes import NODE_CLASS_MAPPINGS as Extra_Nodes - NODE_CLASS_MAPPINGS.update(Extra_Nodes) + # MiaoBi + from .MiaoBi.nodes import NODE_CLASS_MAPPINGS as MiaoBi_Nodes + NODE_CLASS_MAPPINGS.update(MiaoBi_Nodes) - # Sana - from .Sana.nodes import NODE_CLASS_MAPPINGS as Sana_Nodes - NODE_CLASS_MAPPINGS.update(Sana_Nodes) + # Extra + from .utils.nodes import NODE_CLASS_MAPPINGS as Extra_Nodes + NODE_CLASS_MAPPINGS.update(Extra_Nodes) - # Gemma - from .Gemma.nodes import NODE_CLASS_MAPPINGS as Gemma_Nodes - NODE_CLASS_MAPPINGS.update(Gemma_Nodes) + # Sana + from .Sana.nodes import NODE_CLASS_MAPPINGS as Sana_Nodes + NODE_CLASS_MAPPINGS.update(Sana_Nodes) - NODE_DISPLAY_NAME_MAPPINGS = {k:v.TITLE for k,v in NODE_CLASS_MAPPINGS.items()} - __all__ = ['NODE_CLASS_MAPPINGS', 'NODE_DISPLAY_NAME_MAPPINGS'] + # Gemma + from .Gemma.nodes import NODE_CLASS_MAPPINGS as Gemma_Nodes + NODE_CLASS_MAPPINGS.update(Gemma_Nodes) + NODE_DISPLAY_NAME_MAPPINGS = {k:v.TITLE for k,v in NODE_CLASS_MAPPINGS.items()} + __all__ = ['NODE_CLASS_MAPPINGS', 'NODE_DISPLAY_NAME_MAPPINGS'] diff --git a/nodes.py b/nodes.py new file mode 100644 index 0000000..c7ba2f5 --- /dev/null +++ b/nodes.py @@ -0,0 +1,70 @@ +import folder_paths +import comfy.utils + +from .PixArt.loader import load_pixart_state_dict +from .Sana.loader import load_sana_state_dict +from .text_encoders.tenc import load_text_encoder, tenc_names + +loaders = { + "PixArt": load_pixart_state_dict, + "Sana": load_sana_state_dict, +} + +class EXMUnetLoader: + @classmethod + def INPUT_TYPES(s): + return { + "required": { + "unet_name": (folder_paths.get_filename_list("unet"),), + "model_type": (list(loaders.keys()),) + } + } + + RETURN_TYPES = ("MODEL",) + 
FUNCTION = "load_unet" + CATEGORY = "ExtraModels" + TITLE = "Load Diffusion Model (ExtraModels)" + + def load_unet(self, unet_name, model_type): + model_options = {} + unet_path = folder_paths.get_full_path("diffusion_models", unet_name) + loader_fn = loaders[model_type] + sd = comfy.utils.load_torch_file(unet_path) + return (loader_fn(sd),) + +class EXMCLIPLoader: + @classmethod + def INPUT_TYPES(s): + files = [] + files += folder_paths.get_filename_list("clip") + # if "clip_gguf" in folder_paths.folder_names_and_paths: + # files += folder_paths.get_filename_list("clip_gguf") + return { + "required": { + "clip_name": (files, ), + "type": (["PixArt", "MiaoBi", "Sana"],), + } + } + + RETURN_TYPES = ("CLIP",) + FUNCTION = "load_clip" + CATEGORY = "ExtraModels" + TITLE = "CLIPLoader (ExtraModels)" + + def load_clip(self, clip_name, type): + clip_path = folder_paths.get_full_path("clip", clip_name) + clip_type = tenc_names.get(type, None) + + clip = load_text_encoder( + ckpt_paths =[clip_path], + embedding_directory = folder_paths.get_folder_paths("embeddings"), + clip_type = clip_type + ) + return (clip,) + +#class EXMResolutionSelect: + +NODE_CLASS_MAPPINGS = { + "EXMUnetLoader": EXMUnetLoader, + "EXMCLIPLoader": EXMCLIPLoader, +} diff --git a/requirements.txt b/requirements.txt index 1e54893..017b5b8 100644 --- a/requirements.txt +++ b/requirements.txt @@ -1,4 +1,3 @@ -timm>=0.6.13 sentencepiece>=0.1.97 transformers>=4.34.1 accelerate>=0.23.0 diff --git a/text_encoders/pixart/tenc.py b/text_encoders/pixart/tenc.py new file mode 100644 index 0000000..04ab372 --- /dev/null +++ b/text_encoders/pixart/tenc.py @@ -0,0 +1,43 @@ +import os +import torch + +from comfy import sd1_clip +import comfy.text_encoders.t5 +import comfy.text_encoders.sd3_clip +import comfy.model_management + +from transformers import T5TokenizerFast + +class T5XXLModel(comfy.text_encoders.sd3_clip.T5XXLModel): + def __init__(self, **kwargs): + super().__init__(**kwargs) + # make sure empty tokens 
match + self.special_tokens.pop("end") + +class PixArtT5XXL(sd1_clip.SD1ClipModel): + def __init__(self, device="cpu", dtype=None, model_options={}): + super().__init__(device=device, dtype=dtype, name="t5xxl", clip_model=T5XXLModel, model_options=model_options) + +class T5XXLTokenizer(sd1_clip.SDTokenizer): + def __init__(self, embedding_directory=None, tokenizer_data={}): + tokenizer_path = os.path.join( + os.path.dirname(os.path.dirname(os.path.realpath(__file__))), + "tokenizers", "t5_tokenizer", + ) + super().__init__(tokenizer_path, embedding_directory=embedding_directory, pad_with_end=False, embedding_size=4096, embedding_key='t5xxl', tokenizer_class=T5TokenizerFast, has_start_token=False, pad_to_max_length=False, max_length=99999999, min_length=1) + +class PixArtTokenizer(sd1_clip.SD1Tokenizer): + def __init__(self, embedding_directory=None, tokenizer_data={}): + super().__init__(embedding_directory=embedding_directory, tokenizer_data=tokenizer_data, clip_name="t5xxl", tokenizer=T5XXLTokenizer) + +# TODO: don't duplicate this? 
+def pixart_te(dtype_t5=None, t5xxl_scaled_fp8=None): + class PixArtTEModel_(PixArtT5XXL): + def __init__(self, device="cpu", dtype=None, model_options={}): + if t5xxl_scaled_fp8 is not None and "t5xxl_scaled_fp8" not in model_options: + model_options = model_options.copy() + model_options["t5xxl_scaled_fp8"] = t5xxl_scaled_fp8 + if dtype is None: + dtype = dtype_t5 + super().__init__(device=device, dtype=dtype, model_options=model_options) + return PixArtTEModel_ diff --git a/text_encoders/sana/config.json b/text_encoders/sana/config.json new file mode 100644 index 0000000..05131f6 --- /dev/null +++ b/text_encoders/sana/config.json @@ -0,0 +1,35 @@ +{ + "architectures": [ + "Gemma2ForCausalLM" + ], + "attention_bias": false, + "attention_dropout": 0.0, + "attn_logit_softcapping": 50.0, + "bos_token_id": 2, + "cache_implementation": "hybrid", + "eos_token_id": [ + 1, + 107 + ], + "final_logit_softcapping": 30.0, + "head_dim": 256, + "hidden_act": "gelu_pytorch_tanh", + "hidden_activation": "gelu_pytorch_tanh", + "hidden_size": 2304, + "initializer_range": 0.02, + "intermediate_size": 9216, + "max_position_embeddings": 8192, + "model_type": "gemma2", + "num_attention_heads": 8, + "num_hidden_layers": 26, + "num_key_value_heads": 4, + "pad_token_id": 0, + "query_pre_attn_scalar": 256, + "rms_norm_eps": 1e-06, + "rope_theta": 10000.0, + "sliding_window": 4096, + "torch_dtype": "bfloat16", + "transformers_version": "4.42.4", + "use_cache": true, + "vocab_size": 256000 +} diff --git a/text_encoders/sana/gemma.py b/text_encoders/sana/gemma.py new file mode 100644 index 0000000..3b6679c --- /dev/null +++ b/text_encoders/sana/gemma.py @@ -0,0 +1,379 @@ +# Copyright 2024 Google Inc. HuggingFace Inc. team. All rights reserved. +# +# +# Licensed under the Apache License, Version 2.0 (the "License"); +# you may not use this file except in compliance with the License. 
+# You may obtain a copy of the License at +# +# http://www.apache.org/licenses/LICENSE-2.0 +# +# Unless required by applicable law or agreed to in writing, software +# distributed under the License is distributed on an "AS IS" BASIS, +# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. +# See the License for the specific language governing permissions and +# limitations under the License. +import torch +import torch.nn as nn +import importlib + +def rotate_half(x): + x1 = x[..., : x.shape[-1] // 2] + x2 = x[..., x.shape[-1] // 2 :] + return torch.cat((-x2, x1), dim=-1) + +def apply_rotary_pos_emb(q, k, cos, sin, position_ids=None, unsqueeze_dim=1): + cos = cos.unsqueeze(unsqueeze_dim) + sin = sin.unsqueeze(unsqueeze_dim) + q_embed = (q * cos) + (rotate_half(q) * sin) + k_embed = (k * cos) + (rotate_half(k) * sin) + return q_embed, k_embed + +def repeat_kv(hidden_states, n_rep): + batch, num_key_value_heads, slen, head_dim = hidden_states.shape + if n_rep == 1: + return hidden_states + hidden_states = hidden_states[:, :, None, :, :].expand(batch, num_key_value_heads, n_rep, slen, head_dim) + return hidden_states.reshape(batch, num_key_value_heads * n_rep, slen, head_dim) + +def sdpa_attention_forward(config, query, key, value, mask=None, **kwargs): + key = repeat_kv(key, config["num_key_value_groups"]) + value = repeat_kv(value, config["num_key_value_groups"]) + + causal_mask = mask + if mask is not None: + causal_mask = causal_mask[:, :, :, : key.shape[-2]] + + # SDPA with memory-efficient backend is currently (torch==2.1.2) bugged with non-contiguous inputs with custom attn_mask, + # Reference: https://github.com/pytorch/pytorch/issues/112577. 
+ if query.device.type == "cuda" and causal_mask is not None: + query = query.contiguous() + key = key.contiguous() + value = value.contiguous() + + # We dispatch to SDPA's Flash Attention or Efficient kernels via this `is_causal` if statement instead of an inline conditional assignment + # in SDPA to support both torch.compile's dynamic shapes and full graph options. An inline conditional prevents dynamic shapes from compiling. + is_causal = True if causal_mask is None and query.shape[1] > 1 else False + + attn_output = torch.nn.functional.scaled_dot_product_attention( + query, + key, + value, + attn_mask=causal_mask, + dropout_p=0.0, + is_causal=is_causal, + scale=config["scaling"], + ) + attn_output = attn_output.transpose(1, 2).contiguous() + return attn_output, None + +def eager_attention_forward(config, query, key, value, mask, **kwargs): + key_states = repeat_kv(key, config["num_key_value_groups"]) + value_states = repeat_kv(value, config["num_key_value_groups"]) + + attn_weights = torch.matmul(query, key_states.transpose(2, 3)) * config["scaling"] + + if config["attn_logit_softcapping"] is not None: + attn_weights = attn_weights / config["attn_logit_softcapping"] + attn_weights = torch.tanh(attn_weights) + attn_weights = attn_weights * config["attn_logit_softcapping"] + if mask is not None: # no matter the length, we just slice it + causal_mask = mask[:, :, :, : key_states.shape[-2]] + attn_weights = attn_weights + causal_mask + + # upcast attention to fp32 + attn_weights = nn.functional.softmax(attn_weights, dim=-1, dtype=torch.float32).to(query.dtype) + #attn_weights = nn.functional.dropout(attn_weights, p=0, training=config.training) + attn_output = torch.matmul(attn_weights, value_states) + attn_output = attn_output.transpose(1, 2).contiguous() + return attn_output, attn_weights + +# torch 2.0 can't pass scale arg to sdpa +if tuple(int(x) for x in torch.__version__.split(".")[:2]) >= (2, 1): + attention_forward = sdpa_attention_forward +else: + attention_forward = 
eager_attention_forward + +class Gemma2RMSNorm(nn.Module): + def __init__(self, dim, eps=1e-6): + super().__init__() + self.eps = eps + self.weight = nn.Parameter(torch.zeros(dim)) + + def _norm(self, x): + return x * torch.rsqrt(x.pow(2).mean(-1, keepdim=True) + self.eps) + + def forward(self, x): + output = self._norm(x.float()) + # Llama does x.to(float16) * w whilst Gemma2 is (x * w).to(float16) + # See https://github.com/huggingface/transformers/pull/29402 + output = output * (1.0 + self.weight.float()) + return output.type_as(x) + + def extra_repr(self): + return f"{tuple(self.weight.shape)}, eps={self.eps}" + +class Gemma2MLP(nn.Module): + def __init__(self, config, dtype=None, device=None, operations=None): + super().__init__() + self.config = config + self.hidden_size = config["hidden_size"] + self.intermediate_size = config["intermediate_size"] + self.gate_proj = operations.Linear(self.hidden_size, self.intermediate_size, bias=False, dtype=dtype, device=device) + self.up_proj = operations.Linear(self.hidden_size, self.intermediate_size, bias=False, dtype=dtype, device=device) + self.down_proj = operations.Linear(self.intermediate_size, self.hidden_size, bias=False, dtype=dtype, device=device) + if config["hidden_activation"] != "gelu_pytorch_tanh": + raise NotImplementedError("Unknown act mode") + self.act_fn = torch.nn.GELU(approximate="tanh") # "gelu_pytorch_tanh" is the tanh-approximated GELU + + def forward(self, x): + return self.down_proj(self.act_fn(self.gate_proj(x)) * self.up_proj(x)) + +class Gemma2RotaryEmbedding(nn.Module): + def __init__(self, dim, max_position_embeddings=2048, base=10000): + super().__init__() + + self.dim = dim + self.max_position_embeddings = max_position_embeddings + self.base = base + inv_freq = 1.0 / (self.base ** (torch.arange(0, self.dim, 2, dtype=torch.int64).float() / self.dim)) + self.register_buffer("inv_freq", tensor=inv_freq, persistent=False) + + @torch.no_grad() + def forward(self, x, position_ids, seq_len=None): + # x: [bs, num_attention_heads, seq_len, head_size] + 
self.inv_freq.to(x.device) + inv_freq_expanded = self.inv_freq[None, :, None].float().expand(position_ids.shape[0], -1, 1) + position_ids_expanded = position_ids[:, None, :].float() + # Force float32 since bfloat16 loses precision on long contexts + # See https://github.com/huggingface/transformers/pull/29285 + device_type = x.device.type + device_type = device_type if isinstance(device_type, str) and device_type != "mps" else "cpu" + with torch.autocast(device_type=device_type, enabled=False): + freqs = (inv_freq_expanded.float() @ position_ids_expanded.float()).transpose(1, 2) + emb = torch.cat((freqs, freqs), dim=-1) + cos = emb.cos() + sin = emb.sin() + return cos.to(dtype=x.dtype), sin.to(dtype=x.dtype) + +class Gemma2Attention(nn.Module): + """Multi-headed attention from 'Attention Is All You Need' paper""" + + def __init__(self, config, layer_idx=None, dtype=None, device=None, operations=None): + super().__init__() + self.config = config + self.layer_idx = layer_idx + + self.attention_dropout = config["attention_dropout"] + self.hidden_size = config["hidden_size"] + self.num_heads = config["num_attention_heads"] + self.head_dim = config["head_dim"] + self.num_key_value_heads = config["num_key_value_heads"] + self.num_key_value_groups = self.num_heads // self.num_key_value_heads + self.max_position_embeddings = config["max_position_embeddings"] + self.rope_theta = config["rope_theta"] + self.is_causal = True + self.scaling = config["query_pre_attn_scalar"]**-0.5 + self.sliding_window = config["sliding_window"] if not bool(layer_idx % 2) else None + self.attn_logit_softcapping = config["attn_logit_softcapping"] + if self.hidden_size % self.num_heads != 0: + raise ValueError( + f"hidden_size must be divisible by num_heads (got `hidden_size`: {self.hidden_size}" + f" and `num_heads`: {self.num_heads})." 
+ ) + + self.q_proj = operations.Linear(self.hidden_size, self.num_heads * self.head_dim, bias=config["attention_bias"], dtype=dtype, device=device) + self.k_proj = operations.Linear(self.hidden_size, self.num_key_value_heads * self.head_dim, bias=config["attention_bias"], dtype=dtype, device=device) + self.v_proj = operations.Linear(self.hidden_size, self.num_key_value_heads * self.head_dim, bias=config["attention_bias"], dtype=dtype, device=device) + self.o_proj = operations.Linear(self.num_heads * self.head_dim, self.hidden_size, bias=config["attention_bias"], dtype=dtype, device=device) + self.rotary_emb = Gemma2RotaryEmbedding( + self.head_dim, + max_position_embeddings=self.max_position_embeddings, + base=self.rope_theta, + ) + + def forward(self, hidden_states, attention_mask=None, position_ids=None, past_key_value=None, output_attentions=False, use_cache=False, cache_position= None): + bsz, q_len, _ = hidden_states.size() + + query_states = self.q_proj(hidden_states) + key_states = self.k_proj(hidden_states) + value_states = self.v_proj(hidden_states) + + query_states = query_states.view(bsz, q_len, self.num_heads, self.head_dim).transpose(1, 2) + key_states = key_states.view(bsz, q_len, self.num_key_value_heads, self.head_dim).transpose(1, 2) + value_states = value_states.view(bsz, q_len, self.num_key_value_heads, self.head_dim).transpose(1, 2) + + cos, sin = self.rotary_emb(value_states, position_ids) + query_states, key_states = apply_rotary_pos_emb(query_states, key_states, cos, sin) + + if past_key_value is not None: + # sin and cos are specific to RoPE models; cache_position needed for the static cache + cache_kwargs = { + "sin": sin, + "cos": cos, + "sliding_window": self.sliding_window, + "cache_position": cache_position, + } + key_states, value_states = past_key_value.update(key_states, value_states, self.layer_idx, cache_kwargs) + + config = { + "scaling": self.scaling, + "num_key_value_groups": self.num_key_value_groups, + 
"max_position_embeddings": self.max_position_embeddings, + "attn_logit_softcapping": self.attn_logit_softcapping, + } + attn_output, attn_weights = attention_forward(config, query_states, key_states, value_states, attention_mask, output_attentions=output_attentions) + + attn_output = attn_output.reshape(bsz, q_len, -1).contiguous() + attn_output = self.o_proj(attn_output) + + if not output_attentions: + attn_weights = None + + return attn_output, attn_weights, past_key_value + +class Gemma2DecoderLayer(nn.Module): + def __init__(self, config, layer_idx, dtype=None, device=None, operations=None): + super().__init__() + self.hidden_size = config["hidden_size"] + self.config = config + self.is_sliding = not bool(layer_idx % 2) + + self.self_attn = Gemma2Attention(config=config, layer_idx=layer_idx, dtype=dtype, device=device, operations=operations) + self.mlp = Gemma2MLP(config, dtype=dtype, device=device, operations=operations) + + self.input_layernorm = Gemma2RMSNorm(config["hidden_size"], eps=config["rms_norm_eps"]) + self.post_attention_layernorm = Gemma2RMSNorm(config["hidden_size"], eps=config["rms_norm_eps"]) + + self.pre_feedforward_layernorm = Gemma2RMSNorm(config["hidden_size"], eps=config["rms_norm_eps"]) + self.post_feedforward_layernorm = Gemma2RMSNorm(config["hidden_size"], eps=config["rms_norm_eps"]) + self.sliding_window = config["sliding_window"] + + def forward(self, hidden_states, attention_mask=None, position_ids=None, past_key_value=None, output_attentions=False, use_cache=False, cache_position=None): + if self.is_sliding and attention_mask is not None: # efficient SDPA and no padding + # # Flash-attn is a 2D tensor + # if self.config["_attn_implementation"] == "flash_attention_2": + # if past_key_value is not None: # when decoding + # attention_mask = attention_mask[:, -self.sliding_window :] + # else: + min_dtype = torch.finfo(hidden_states.dtype).min + sliding_window_mask = torch.tril( + torch.ones_like(attention_mask, dtype=torch.bool), 
diagonal=-self.sliding_window + ) + attention_mask = torch.where(sliding_window_mask, min_dtype, attention_mask) + if attention_mask.shape[-1] <= 1: # when decoding + attention_mask = attention_mask[:, :, :, -self.sliding_window :] + + residual = hidden_states + hidden_states = self.input_layernorm(hidden_states) + + # Self Attention + hidden_states, self_attn_weights, present_key_value = self.self_attn( + hidden_states=hidden_states, + attention_mask=attention_mask, + position_ids=position_ids, + past_key_value=past_key_value, + output_attentions=output_attentions, + use_cache=use_cache, + cache_position=cache_position, + ) + hidden_states = self.post_attention_layernorm(hidden_states) + hidden_states = residual + hidden_states + + residual = hidden_states + hidden_states = self.pre_feedforward_layernorm(hidden_states) + hidden_states = self.mlp(hidden_states) + hidden_states = self.post_feedforward_layernorm(hidden_states) + hidden_states = residual + hidden_states + + outputs = (hidden_states,) + + if output_attentions: + outputs += (self_attn_weights,) + + if use_cache: + outputs += (present_key_value,) + + return outputs + +def prepare_causal_mask(input_tensor, attention_mask): + dtype, device = input_tensor.dtype, input_tensor.device + batch_size=input_tensor.shape[0] + sequence_length = input_tensor.shape[1] + target_length = attention_mask.shape[-1] if attention_mask is not None else input_tensor.shape[1] + + min_dtype = torch.finfo(dtype).min + causal_mask = torch.full( + (sequence_length, target_length), fill_value=min_dtype, dtype=dtype, device=device + ) + if sequence_length != 1: + causal_mask = torch.triu(causal_mask, diagonal=1) + #causal_mask *= torch.arange(target_length, device=device) > cache_position.reshape(-1, 1) + causal_mask = causal_mask[None, None, :, :].expand(batch_size, 1, -1, -1) + if attention_mask is not None: + causal_mask = causal_mask.clone() # copy to contiguous memory for in-place edit + mask_length = attention_mask.shape[-1] + 
padding_mask = causal_mask[:, :, :, :mask_length] + attention_mask[:, None, None, :] + padding_mask = padding_mask == 0 + causal_mask[:, :, :, :mask_length] = causal_mask[:, :, :, :mask_length].masked_fill( + padding_mask, min_dtype + ) + + return causal_mask + +class Gemma2Model(torch.nn.Module): + def __init__(self, config_dict, dtype, device, operations): + super().__init__() + self.padding_idx = 0 + self.hidden_size = config_dict["hidden_size"] + self.embed_tokens = operations.Embedding(config_dict["vocab_size"], self.hidden_size, self.padding_idx, device=device, dtype=dtype) + self.num_layers = config_dict["num_hidden_layers"] + self.layers = nn.ModuleList( + [Gemma2DecoderLayer(config_dict, layer_idx, dtype=dtype, device=device, operations=operations) for layer_idx in range(config_dict["num_hidden_layers"])] + ) + self.norm = Gemma2RMSNorm(self.hidden_size, eps=config_dict["rms_norm_eps"]) + + def get_input_embeddings(self): + return self.embed_tokens + + def set_input_embeddings(self, value): + self.embed_tokens = value + + def forward(self, input_ids=None, attention_mask=None, position_ids=None, intermediate_output=None, final_layer_norm_intermediate=False, *args, **kwargs): + inputs_embeds = self.embed_tokens(input_ids, out_dtype=kwargs.get("dtype", torch.float32)) + hidden_states = inputs_embeds + intermediate = None + + if attention_mask is not None and position_ids is None: + position_ids = attention_mask.long().cumsum(-1) - 1 + position_ids.masked_fill_(attention_mask == 0, 1) + + # normalized + # Gemma2 downcasts the below to float16, causing sqrt(3072)=55.4256 to become 55.5 + # See https://github.com/huggingface/transformers/pull/29402 + normalizer = torch.tensor(self.hidden_size**0.5, dtype=hidden_states.dtype) + hidden_states = hidden_states * normalizer + + causal_mask = prepare_causal_mask(inputs_embeds, attention_mask) + + if intermediate_output is not None: + if intermediate_output < 0: + intermediate_output = len(self.layers) + 
intermediate_output + + for i, decoder_layer in enumerate(self.layers): + layer_outputs = decoder_layer( + hidden_states, + attention_mask=causal_mask, + position_ids=position_ids, + past_key_value=None, + output_attentions=False, + use_cache=False, + ) + if i == intermediate_output: + intermediate = hidden_states.clone() + hidden_states = layer_outputs[0] + + hidden_states = self.norm(hidden_states) + if intermediate is not None and final_layer_norm_intermediate: + intermediate = self.norm(intermediate) + + return hidden_states, intermediate diff --git a/text_encoders/sana/tenc.py b/text_encoders/sana/tenc.py new file mode 100644 index 0000000..45da847 --- /dev/null +++ b/text_encoders/sana/tenc.py @@ -0,0 +1,46 @@ +import os +import torch + +from comfy import sd1_clip +import comfy.model_management +from .gemma import Gemma2Model + +from transformers import GemmaTokenizer as TFGemmaTokenizer + +class GemmaClipModel(sd1_clip.SDClipModel): + def __init__(self, device="cpu", dtype=None, model_options={}, **kwargs): + textmodel_json_config = os.path.join(os.path.dirname(os.path.realpath(__file__)), "config.json") + special_tokens = {"start": 2, "end": 1, "pad": 0} + super().__init__(device=device, layer="last", layer_idx=None, textmodel_json_config=textmodel_json_config, dtype=dtype, special_tokens=special_tokens, model_class=Gemma2Model, enable_attention_masks=True, return_attention_masks=False, model_options=model_options) + +class SanaClipModel(sd1_clip.SD1ClipModel): + def __init__(self, device="cpu", dtype=None, model_options={}): + super().__init__(device=device, dtype=dtype, name="gemma", clip_model=GemmaClipModel, model_options=model_options) + +class GemmaTokenizer(sd1_clip.SDTokenizer): + def __init__(self, embedding_directory=None, tokenizer_data={}): + tokenizer_path = os.path.join( + os.path.dirname(os.path.dirname(os.path.realpath(__file__))), + "tokenizers", "gemma_tokenizer", + ) + # TODO: reenable proper logic here - needs comfy version 44db978 or 
higher + super().__init__(tokenizer_path, embedding_directory=embedding_directory, pad_with_end=False, embedding_size=2304, embedding_key='gemma', tokenizer_class=TFGemmaTokenizer, has_start_token=False, pad_to_max_length=True, max_length=300, min_length=1) + self.start_token = 2 + self.end_token = 1 + self.pad_token = 0 + + def tokenize_with_weights(self, text, return_word_ids=False): + # TODO: see above, this is still just a wrapper for now + tokens = self.tokenizer( + text, + max_length=300, + padding="max_length", + truncation=True, + return_tensors="pt" + ) + batched_tokens = [(x.item(), 1.0) for x in tokens.input_ids[0]] + return [batched_tokens] + +class SanaTokenizer(sd1_clip.SD1Tokenizer): + def __init__(self, embedding_directory=None, tokenizer_data={}): + super().__init__(embedding_directory=embedding_directory, tokenizer_data=tokenizer_data, clip_name="gemma", tokenizer=GemmaTokenizer) diff --git a/text_encoders/tenc.py b/text_encoders/tenc.py new file mode 100644 index 0000000..8ec8acc --- /dev/null +++ b/text_encoders/tenc.py @@ -0,0 +1,81 @@ +import logging +from enum import Enum + +import comfy.sd +import comfy.utils +import comfy.text_encoders +import comfy.model_management + +from .pixart.tenc import pixart_te, PixArtTokenizer +from .sana.tenc import SanaClipModel, SanaTokenizer + +class TencType(Enum): + # offset in case we ever integrate w/ original + PixArt = 1001 + MiaoBi = 1002 + # HunYuan = 1003 # deprecated + Sana = 1004 + +tenc_names = { + # for node readout + "PixArt": TencType.PixArt, + "MiaoBi": TencType.MiaoBi, + # "HunYuan": TencType.HunYuan, + "Sana": TencType.Sana, +} + + +def load_text_encoder(ckpt_paths, embedding_directory=None, clip_type=TencType.PixArt, model_options={}): + # Partial duplicate of ComfyUI/comfy/sd:load_clip + clip_data = [] + for p in ckpt_paths: + if p.lower().endswith(".gguf"): + # TODO: cross-node call w/o code duplication + raise NotImplementedError("Planned!") + else: + 
clip_data.append(comfy.utils.load_torch_file(p, safe_load=True)) + return load_text_encoder_state_dicts(clip_data, embedding_directory=embedding_directory, clip_type=clip_type, model_options=model_options) + +def load_text_encoder_state_dicts(state_dicts=[], embedding_directory=None, clip_type=TencType.PixArt, model_options={}): + # Partial duplicate of ComfyUI/comfy/sd:load_text_encoder_state_dicts + clip_data = state_dicts + + class EmptyClass: + pass + + for i in range(len(clip_data)): + if "transformer.resblocks.0.ln_1.weight" in clip_data[i]: + clip_data[i] = comfy.utils.clip_text_transformers_convert(clip_data[i], "", "") + elif "model.layers.25.post_feedforward_layernorm.weight" in clip_data[i]: + clip_data[i] = {k[len("model."):]:v for k,v in clip_data[i].items()} + else: + if "text_projection" in clip_data[i]: + clip_data[i]["text_projection.weight"] = clip_data[i]["text_projection"].transpose(0, 1) #old models saved with the CLIPSave node + + clip_target = EmptyClass() + clip_target.params = {} + + if clip_type == TencType.PixArt: + clip_target.clip = pixart_te(**comfy.sd.t5xxl_detect(clip_data)) + clip_target.tokenizer = PixArtTokenizer + elif clip_type == TencType.Sana: + clip_target.clip = SanaClipModel + clip_target.tokenizer = SanaTokenizer + else: + raise NotImplementedError(f"Unknown tenc: {clip_type}") + + parameters = 0 + tokenizer_data = {} + for c in clip_data: + parameters += comfy.utils.calculate_parameters(c) + tokenizer_data, model_options = comfy.text_encoders.long_clipl.model_options_long_clip(c, tokenizer_data, model_options) + + clip = comfy.sd.CLIP(clip_target, embedding_directory=embedding_directory, parameters=parameters, tokenizer_data=tokenizer_data, model_options=model_options) + for c in clip_data: + m, u = clip.load_sd(c) + if len(m) > 0: + logging.warning("clip missing: {}".format(m)) + + if len(u) > 0: + logging.debug("clip unexpected: {}".format(u)) + return clip diff --git 
a/text_encoders/tokenizers/gemma_tokenizer/special_tokens_map.json b/text_encoders/tokenizers/gemma_tokenizer/special_tokens_map.json new file mode 100644 index 0000000..8d6368f --- /dev/null +++ b/text_encoders/tokenizers/gemma_tokenizer/special_tokens_map.json @@ -0,0 +1,34 @@ +{ + "additional_special_tokens": [ + "<start_of_turn>", + "<end_of_turn>" + ], + "bos_token": { + "content": "<bos>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false + }, + "eos_token": { + "content": "<eos>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false + }, + "pad_token": { + "content": "<pad>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false + }, + "unk_token": { + "content": "<unk>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false + } +} diff --git a/HunYuanDiT/mt5_tokenizer/spiece.model b/text_encoders/tokenizers/gemma_tokenizer/tokenizer.model similarity index 63% rename from HunYuanDiT/mt5_tokenizer/spiece.model rename to text_encoders/tokenizers/gemma_tokenizer/tokenizer.model index 26a2a78..14a2422 100644 Binary files a/HunYuanDiT/mt5_tokenizer/spiece.model and b/text_encoders/tokenizers/gemma_tokenizer/tokenizer.model differ diff --git a/text_encoders/tokenizers/gemma_tokenizer/tokenizer_config.json b/text_encoders/tokenizers/gemma_tokenizer/tokenizer_config.json new file mode 100644 index 0000000..f23acea --- /dev/null +++ b/text_encoders/tokenizers/gemma_tokenizer/tokenizer_config.json @@ -0,0 +1,1516 @@ +{ + "add_bos_token": true, + "add_eos_token": false, + "added_tokens_decoder": { + "0": { + "content": "<pad>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "1": { + "content": "<eos>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "2": { + "content": "<bos>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, +
"3": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "4": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "5": { + "content": "<2mass>", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "6": { + "content": "[@BOS@]", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "7": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "8": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "9": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "10": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "11": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "12": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "13": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "14": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "15": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "16": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "17": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + 
"single_word": false, + "special": false + }, + "18": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "19": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "20": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "21": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "22": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "23": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "24": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "25": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "26": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "27": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "28": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "29": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "30": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "31": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "32": { + "content": "", + "lstrip": false, + 
"normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "33": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "34": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "35": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "36": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "37": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "38": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "39": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "40": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "41": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "42": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "43": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "44": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "45": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "46": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "47": 
{ + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "48": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "49": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "50": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "51": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "52": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "53": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "54": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "55": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "56": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "57": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "58": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "59": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "60": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "61": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + 
"single_word": false, + "special": false + }, + "62": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "63": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "64": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "65": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "66": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "67": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "68": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "69": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "70": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "71": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "72": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "73": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "74": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "75": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "76": { + "content": "", + "lstrip": false, + 
"normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "77": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "78": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "79": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "80": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "81": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "82": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "83": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "84": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "85": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "86": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "87": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "88": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "89": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "90": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "91": 
{ + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "92": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "93": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "94": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "95": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "96": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "97": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "98": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "99": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "100": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "101": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "102": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "103": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "104": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "105": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + 
"single_word": false, + "special": false + }, + "106": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "107": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": true + }, + "108": { + "content": "\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "109": { + "content": "\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "110": { + "content": "\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "111": { + "content": "\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "112": { + "content": "\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "113": { + "content": "\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "114": { + "content": "\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "115": { + "content": "\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "116": { + "content": "\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "117": { + "content": "\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "118": { + "content": "\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "119": { + "content": 
"\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "120": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "121": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "122": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "123": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "124": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "125": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "126": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "127": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "128": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "129": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "130": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + 
"single_word": false, + "special": false + }, + "131": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "132": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "133": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "134": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "135": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "136": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "137": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "138": { + "content": "\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n\n", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "169": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "170": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "172": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + 
"173": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "174": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "175": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "171": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "176": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "177": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "178": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "179": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "180": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "181": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "182": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "183": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "184": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "185": { + "content": "

", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "186": { + "content": "

", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "187": { + "content": "

", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "188": { + "content": "

", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "189": { + "content": "

", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "190": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "191": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "192": { + "content": "
", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "193": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "194": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "195": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "196": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "197": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "198": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "199": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "200": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "201": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "202": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "203": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "204": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "205": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "206": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": 
false, + "special": false + }, + "207": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "208": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "209": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "210": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "211": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "212": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "213": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "214": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "215": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + }, + "216": { + "content": "", + "lstrip": false, + "normalized": false, + "rstrip": false, + "single_word": false, + "special": false + } + }, + "additional_special_tokens": [ + "", + "" + ], + "bos_token": "", + "clean_up_tokenization_spaces": false, + "eos_token": "", + "model_max_length": 1000000000000000019884624838656, + "pad_token": "", + "sp_model_kwargs": {}, + "spaces_between_special_tokens": false, + "tokenizer_class": "GemmaTokenizer", + "unk_token": "", + "use_default_system_prompt": false +} diff --git a/T5/t5_tokenizer/special_tokens_map.json b/text_encoders/tokenizers/t5_tokenizer/special_tokens_map.json similarity index 100% rename from T5/t5_tokenizer/special_tokens_map.json rename to 
text_encoders/tokenizers/t5_tokenizer/special_tokens_map.json diff --git a/T5/t5_tokenizer/spiece.model b/text_encoders/tokenizers/t5_tokenizer/spiece.model similarity index 100% rename from T5/t5_tokenizer/spiece.model rename to text_encoders/tokenizers/t5_tokenizer/spiece.model diff --git a/T5/t5_tokenizer/tokenizer_config.json b/text_encoders/tokenizers/t5_tokenizer/tokenizer_config.json similarity index 100% rename from T5/t5_tokenizer/tokenizer_config.json rename to text_encoders/tokenizers/t5_tokenizer/tokenizer_config.json diff --git a/utils/loader.py b/utils/loader.py new file mode 100644 index 0000000..f3f1b77 --- /dev/null +++ b/utils/loader.py @@ -0,0 +1,35 @@ +import logging +import comfy.utils +import comfy.model_patcher +from comfy import model_management + +def load_state_dict_from_config(model_config, sd, model_options={}): + parameters = comfy.utils.calculate_parameters(sd) + load_device = model_management.get_torch_device() + offload_device = comfy.model_management.unet_offload_device() + + dtype = model_options.get("dtype", None) + weight_dtype = comfy.utils.weight_dtype(sd) + unet_weight_dtype = list(model_config.supported_inference_dtypes) + + if weight_dtype is not None and model_config.scaled_fp8 is None: + unet_weight_dtype.append(weight_dtype) + + if dtype is None: + unet_dtype = model_management.unet_dtype(model_params=parameters, supported_dtypes=unet_weight_dtype) + else: + unet_dtype = dtype + + manual_cast_dtype = model_management.unet_manual_cast(unet_dtype, load_device, model_config.supported_inference_dtypes) + model_config.set_inference_dtype(unet_dtype, manual_cast_dtype) + model_config.custom_operations = model_options.get("custom_operations", model_config.custom_operations) + if model_options.get("fp8_optimizations", False): + model_config.optimizations["fp8"] = True + + model = model_config.get_model(sd, "") + model = model.to(offload_device).eval() + model.load_model_weights(sd, "") + left_over = sd.keys() + if len(left_over) 
> 0: + logging.info("left over keys in unet: {}".format(left_over)) + return comfy.model_patcher.ModelPatcher(model, load_device=load_device, offload_device=offload_device)
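
Note on the dtype selection in the new `utils/loader.py`: an explicit `model_options["dtype"]` always wins; otherwise the config's supported inference dtypes are considered, optionally widened by the dtype found in the checkpoint weights (skipped when a scaled-fp8 scheme is active). A minimal standalone sketch of that precedence, with string names standing in for torch dtypes and `pick_unet_dtype` as a hypothetical helper (the real `model_management.unet_dtype` also weighs device capabilities and parameter count):

```python
def pick_unet_dtype(model_options, weight_dtype, supported_dtypes, scaled_fp8=None):
    # Explicit override from model_options takes precedence over everything.
    dtype = model_options.get("dtype")
    if dtype is not None:
        return dtype
    candidates = list(supported_dtypes)
    # Only consider the checkpoint's stored weight dtype when no
    # scaled-fp8 scheme is in use (mirrors the scaled_fp8 guard above).
    if weight_dtype is not None and scaled_fp8 is None:
        candidates.append(weight_dtype)
    # Stand-in for model_management.unet_dtype(): take the config's
    # first (preferred) supported dtype.
    return candidates[0]

# An explicit request wins:
assert pick_unet_dtype({"dtype": "fp32"}, "fp16", ["bf16"]) == "fp32"
# Otherwise the config's preferred dtype is used:
assert pick_unet_dtype({}, "fp16", ["bf16", "fp32"]) == "bf16"
```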