But what if the input is massive? In that case I would want to partition the string into several portions based on a size threshold that can be set in the method, and map each processed portion into a hashmap keyed by its position. Threading fits naturally here because the portions can be processed concurrently rather than sequentially. A rough sketch of this idea appears after the sample calls below.
You may test this code with the sample method calls shown after it:
// Replaces every space in the input with "%20" by default.
public static StringBuffer removeSpace(String input) {
    return removeSpace(input, "%20");
}

// Replaces every space in the input with the given filler string.
public static StringBuffer removeSpace(String input, String fillerString) {
    StringBuffer sb = new StringBuffer();
    int strLength = input.length();
    int index = 0;
    while (index < strLength) {
        if (input.charAt(index) != ' ') {
            sb.append(input.charAt(index));
        } else {
            sb.append(fillerString);
        }
        index++;
    }
    return sb;
}
System.out.println(removeSpace("Hello World, I am Nicholas Key"));
System.out.println(removeSpace("Hello World, I am Nicholas Key", "[SPACE]"));